[Log viewer controls: Node ID / Columns / Log Level / Log Marker / Class]
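Editorial note: each record below carries, in order, the node ID, what appears to be the elapsed time since that node's first message, the wall-clock timestamp, a per-node sequence number, the log level, the log marker, the thread name in angle brackets, the logging class, and the message. A minimal sketch of splitting one such line, assuming exactly this whitespace-separated layout; the class name and pattern are illustrative and not part of the platform:

    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    // Hypothetical helper: splits one record of the form
    // "<node> <elapsed> <date> <time> <seq> <LEVEL> <MARKER> <thread> <Class>: <message>".
    public final class LogLineSketch {
        private static final Pattern LINE = Pattern.compile(
                "^(\\S+) (\\S+) (\\S+ \\S+) (\\d+) (\\w+) (\\w+) <(.+?)> (\\S+):\\s*(.*)$");

        public static void main(String[] args) {
            String sample = "node1 82.000ms 2025-09-30 05:45:00.479 2 DEBUG STARTUP <main> "
                    + "StaticPlatformBuilder: main() started {} "
                    + "[com.swirlds.logging.legacy.payload.NodeStartPayload]";
            Matcher m = LINE.matcher(sample);
            if (m.matches()) {
                // Prints: node=node1 level=DEBUG marker=STARTUP class=StaticPlatformBuilder
                System.out.println("node=" + m.group(1) + " level=" + m.group(5)
                        + " marker=" + m.group(6) + " class=" + m.group(8));
            }
        }
    }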
node1 0.000ns 2025-09-30 05:45:00.397 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node2 4.000ms 2025-09-30 05:45:00.401 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node1 82.000ms 2025-09-30 05:45:00.479 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node1 97.000ms 2025-09-30 05:45:00.494 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node2 103.000ms 2025-09-30 05:45:00.500 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node2 119.000ms 2025-09-30 05:45:00.516 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node1 205.000ms 2025-09-30 05:45:00.602 4 INFO STARTUP <main> Browser: The following nodes [1] are set to run locally
node1 211.000ms 2025-09-30 05:45:00.608 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node1 223.000ms 2025-09-30 05:45:00.620 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node2 255.000ms 2025-09-30 05:45:00.652 4 INFO STARTUP <main> Browser: The following nodes [2] are set to run locally
node2 261.000ms 2025-09-30 05:45:00.658 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node2 273.000ms 2025-09-30 05:45:00.670 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node3 389.000ms 2025-09-30 05:45:00.786 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node3 478.000ms 2025-09-30 05:45:00.875 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node3 493.000ms 2025-09-30 05:45:00.890 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node3 611.000ms 2025-09-30 05:45:01.008 4 INFO STARTUP <main> Browser: The following nodes [3] are set to run locally
node3 618.000ms 2025-09-30 05:45:01.015 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node1 622.000ms 2025-09-30 05:45:01.019 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node1 623.000ms 2025-09-30 05:45:01.020 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node3 630.000ms 2025-09-30 05:45:01.027 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node2 722.000ms 2025-09-30 05:45:01.119 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node2 723.000ms 2025-09-30 05:45:01.120 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node0 788.000ms 2025-09-30 05:45:01.185 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node0 877.000ms 2025-09-30 05:45:01.274 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node0 893.000ms 2025-09-30 05:45:01.290 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node0 1.009s 2025-09-30 05:45:01.406 4 INFO STARTUP <main> Browser: The following nodes [0] are set to run locally
node0 1.015s 2025-09-30 05:45:01.412 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node0 1.027s 2025-09-30 05:45:01.424 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node3 1.063s 2025-09-30 05:45:01.460 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node3 1.064s 2025-09-30 05:45:01.461 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node4 1.112s 2025-09-30 05:45:01.509 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node4 1.205s 2025-09-30 05:45:01.602 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node4 1.221s 2025-09-30 05:45:01.618 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 1.342s 2025-09-30 05:45:01.739 4 INFO STARTUP <main> Browser: The following nodes [4] are set to run locally
node4 1.350s 2025-09-30 05:45:01.747 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node4 1.363s 2025-09-30 05:45:01.760 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node1 1.450s 2025-09-30 05:45:01.847 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 827ms
node0 1.459s 2025-09-30 05:45:01.856 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node1 1.459s 2025-09-30 05:45:01.856 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node0 1.460s 2025-09-30 05:45:01.857 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node1 1.462s 2025-09-30 05:45:01.859 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node1 1.501s 2025-09-30 05:45:01.898 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node1 1.562s 2025-09-30 05:45:01.959 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node1 1.563s 2025-09-30 05:45:01.960 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node2 1.573s 2025-09-30 05:45:01.970 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 846ms
node2 1.588s 2025-09-30 05:45:01.985 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node2 1.592s 2025-09-30 05:45:01.989 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node2 1.636s 2025-09-30 05:45:02.033 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node2 1.698s 2025-09-30 05:45:02.095 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node2 1.699s 2025-09-30 05:45:02.096 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node4 1.816s 2025-09-30 05:45:02.213 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node4 1.817s 2025-09-30 05:45:02.214 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node3 2.173s 2025-09-30 05:45:02.570 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1107ms
node3 2.187s 2025-09-30 05:45:02.584 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node3 2.193s 2025-09-30 05:45:02.590 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node3 2.238s 2025-09-30 05:45:02.635 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node3 2.318s 2025-09-30 05:45:02.715 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node3 2.319s 2025-09-30 05:45:02.716 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node0 2.330s 2025-09-30 05:45:02.727 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 870ms
node0 2.339s 2025-09-30 05:45:02.736 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node0 2.342s 2025-09-30 05:45:02.739 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node0 2.385s 2025-09-30 05:45:02.782 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node0 2.450s 2025-09-30 05:45:02.847 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node0 2.451s 2025-09-30 05:45:02.848 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node4 2.877s 2025-09-30 05:45:03.274 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1060ms
node4 2.887s 2025-09-30 05:45:03.284 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node4 2.891s 2025-09-30 05:45:03.288 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 2.932s 2025-09-30 05:45:03.329 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node4 2.994s 2025-09-30 05:45:03.391 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node4 2.995s 2025-09-30 05:45:03.392 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node1 3.615s 2025-09-30 05:45:04.012 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node1 3.699s 2025-09-30 05:45:04.096 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 3.701s 2025-09-30 05:45:04.098 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node1 3.702s 2025-09-30 05:45:04.099 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node2 3.720s 2025-09-30 05:45:04.117 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node2 3.808s 2025-09-30 05:45:04.205 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 3.810s 2025-09-30 05:45:04.207 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node2 3.811s 2025-09-30 05:45:04.208 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node0 4.478s 2025-09-30 05:45:04.875 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node1 4.481s 2025-09-30 05:45:04.878 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 4.492s 2025-09-30 05:45:04.889 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node1 4.497s 2025-09-30 05:45:04.894 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node3 4.498s 2025-09-30 05:45:04.895 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node1 4.510s 2025-09-30 05:45:04.907 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 4.512s 2025-09-30 05:45:04.909 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 4.559s 2025-09-30 05:45:04.956 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 4.561s 2025-09-30 05:45:04.958 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node0 4.562s 2025-09-30 05:45:04.959 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node3 4.585s 2025-09-30 05:45:04.982 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 4.590s 2025-09-30 05:45:04.987 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node3 4.591s 2025-09-30 05:45:04.988 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node2 4.596s 2025-09-30 05:45:04.993 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 4.606s 2025-09-30 05:45:05.003 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node2 4.612s 2025-09-30 05:45:05.009 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node2 4.622s 2025-09-30 05:45:05.019 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 4.624s 2025-09-30 05:45:05.021 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5.005s 2025-09-30 05:45:05.402 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node4 5.097s 2025-09-30 05:45:05.494 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5.100s 2025-09-30 05:45:05.497 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node4 5.101s 2025-09-30 05:45:05.498 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node0 5.356s 2025-09-30 05:45:05.753 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 5.368s 2025-09-30 05:45:05.765 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node0 5.375s 2025-09-30 05:45:05.772 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node0 5.387s 2025-09-30 05:45:05.784 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 5.390s 2025-09-30 05:45:05.787 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 5.433s 2025-09-30 05:45:05.830 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 5.443s 2025-09-30 05:45:05.840 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node3 5.449s 2025-09-30 05:45:05.846 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node3 5.459s 2025-09-30 05:45:05.856 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 5.461s 2025-09-30 05:45:05.858 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 5.619s 2025-09-30 05:45:06.016 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26258681]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=115780, randomLong=7611177194791799395, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=11449, randomLong=3398887956124351831, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1026189, data=35, exception=null]
OS Health Check Report - Complete (took 1019 ms)
node1 5.648s 2025-09-30 05:45:06.045 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node1 5.655s 2025-09-30 05:45:06.052 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node1 5.658s 2025-09-30 05:45:06.055 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node1 5.733s 2025-09-30 05:45:06.130 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "iHDorA==", "port": 30124 }, { "ipAddressV4": "CoAAeQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "IjghRQ==", "port": 30125 }, { "ipAddressV4": "CoAAew==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "IqzvOQ==", "port": 30126 }, { "ipAddressV4": "CoAAeA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "iHJx3w==", "port": 30127 }, { "ipAddressV4": "CoAAeg==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "I7wqdA==", "port": 30128 }, { "ipAddressV4": "CoAAfA==", "port": 30128 }] }] }
node2 5.737s 2025-09-30 05:45:06.134 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26358538]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=232670, randomLong=-6171700715785328117, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=10200, randomLong=-7334580286669251943, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1158230, data=35, exception=null]
OS Health Check Report - Complete (took 1022 ms)
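Editorial note on the roster dump logged by SwirldsPlatform above: the "ipAddressV4" values appear to be the raw four address bytes, base64-encoded (the way protobuf's JSON form renders a bytes field), so they can be turned back into dotted-quad notation. A small sketch under that assumption; the class name is illustrative:

    import java.net.InetAddress;
    import java.util.Base64;

    // Hypothetical helper: decode one "ipAddressV4" value from the roster JSON above.
    public final class RosterIpSketch {
        public static void main(String[] args) throws Exception {
            byte[] raw = Base64.getDecoder().decode("CoAAeQ==");  // second endpoint of the first roster entry
            System.out.println(InetAddress.getByAddress(raw).getHostAddress());  // 10.128.0.121
        }
    }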
node1 5.754s 2025-09-30 05:45:06.151 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/1/ConsistencyTestLog.csv
node1 5.755s 2025-09-30 05:45:06.152 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node2 5.768s 2025-09-30 05:45:06.165 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node1 5.769s 2025-09-30 05:45:06.166 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: 2d79d2f0c62ef0c809e7eed726536d76d75b3f353607a4b44cd5ff1dcebcaf7aa8d42249f1a78abf58dc8d256014010a
(root) ConsistencyTestingToolState / quality-pear-demise-train
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate
    1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
    2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
node2 5.776s 2025-09-30 05:45:06.173 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node2 5.778s 2025-09-30 05:45:06.175 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node2 5.856s 2025-09-30 05:45:06.253 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "iHDorA==", "port": 30124 }, { "ipAddressV4": "CoAAeQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "IjghRQ==", "port": 30125 }, { "ipAddressV4": "CoAAew==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "IqzvOQ==", "port": 30126 }, { "ipAddressV4": "CoAAeA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "iHJx3w==", "port": 30127 }, { "ipAddressV4": "CoAAeg==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "I7wqdA==", "port": 30128 }, { "ipAddressV4": "CoAAfA==", "port": 30128 }] }] }
node2 5.876s 2025-09-30 05:45:06.273 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/2/ConsistencyTestLog.csv
node2 5.877s 2025-09-30 05:45:06.274 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node2 5.891s 2025-09-30 05:45:06.288 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: 2d79d2f0c62ef0c809e7eed726536d76d75b3f353607a4b44cd5ff1dcebcaf7aa8d42249f1a78abf58dc8d256014010a
(root) ConsistencyTestingToolState / quality-pear-demise-train
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate
    1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
    2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
node4 5.932s 2025-09-30 05:45:06.329 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5.943s 2025-09-30 05:45:06.340 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node4 5.949s 2025-09-30 05:45:06.346 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node4 5.961s 2025-09-30 05:45:06.358 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5.963s 2025-09-30 05:45:06.360 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 5.983s 2025-09-30 05:45:06.380 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node1 5.986s 2025-09-30 05:45:06.383 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node1 5.991s 2025-09-30 05:45:06.388 47 INFO STARTUP <<start-node-1>> ConsistencyTestingToolMain: init called in Main for node 1.
node1 5.991s 2025-09-30 05:45:06.388 48 INFO STARTUP <<start-node-1>> SwirldsPlatform: Starting platform 1
node1 5.992s 2025-09-30 05:45:06.389 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node1 5.995s 2025-09-30 05:45:06.392 50 INFO STARTUP <<start-node-1>> CycleFinder: No cyclical back pressure detected in wiring model.
node1 5.996s 2025-09-30 05:45:06.393 51 INFO STARTUP <<start-node-1>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node1 5.996s 2025-09-30 05:45:06.393 52 INFO STARTUP <<start-node-1>> InputWireChecks: All input wires have been bound.
node1 5.998s 2025-09-30 05:45:06.395 53 WARN STARTUP <<start-node-1>> PcesFileTracker: No preconsensus event files available
node1 5.998s 2025-09-30 05:45:06.395 54 INFO STARTUP <<start-node-1>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node1 5.999s 2025-09-30 05:45:06.396 55 INFO STARTUP <<start-node-1>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node1 6.001s 2025-09-30 05:45:06.398 56 INFO STARTUP <<app: appMain 1>> ConsistencyTestingToolMain: run called in Main.
node1 6.003s 2025-09-30 05:45:06.400 57 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 182.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node1 6.008s 2025-09-30 05:45:06.405 58 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node2 6.109s 2025-09-30 05:45:06.506 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node2 6.114s 2025-09-30 05:45:06.511 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node2 6.118s 2025-09-30 05:45:06.515 47 INFO STARTUP <<start-node-2>> ConsistencyTestingToolMain: init called in Main for node 2.
node2 6.119s 2025-09-30 05:45:06.516 48 INFO STARTUP <<start-node-2>> SwirldsPlatform: Starting platform 2
node2 6.120s 2025-09-30 05:45:06.517 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node2 6.123s 2025-09-30 05:45:06.520 50 INFO STARTUP <<start-node-2>> CycleFinder: No cyclical back pressure detected in wiring model.
node2 6.124s 2025-09-30 05:45:06.521 51 INFO STARTUP <<start-node-2>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node2 6.124s 2025-09-30 05:45:06.521 52 INFO STARTUP <<start-node-2>> InputWireChecks: All input wires have been bound.
node2 6.126s 2025-09-30 05:45:06.523 53 WARN STARTUP <<start-node-2>> PcesFileTracker: No preconsensus event files available
node2 6.126s 2025-09-30 05:45:06.523 54 INFO STARTUP <<start-node-2>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node2 6.128s 2025-09-30 05:45:06.525 55 INFO STARTUP <<start-node-2>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node2 6.129s 2025-09-30 05:45:06.526 56 INFO STARTUP <<app: appMain 2>> ConsistencyTestingToolMain: run called in Main.
node2 6.131s 2025-09-30 05:45:06.528 57 INFO PLATFORM_STATUS <platformForkJoinThread-3> StatusStateMachine: Platform spent 183.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node2 6.136s 2025-09-30 05:45:06.533 58 INFO PLATFORM_STATUS <platformForkJoinThread-3> StatusStateMachine: Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node0 6.518s 2025-09-30 05:45:06.915 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26183195]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=206140, randomLong=2410458313962146061, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=11220, randomLong=619050069164687378, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1041297, data=35, exception=null]
OS Health Check Report - Complete (took 1025 ms)
node0 6.554s 2025-09-30 05:45:06.951 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node0 6.563s 2025-09-30 05:45:06.960 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node0 6.566s 2025-09-30 05:45:06.963 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node3 6.576s 2025-09-30 05:45:06.973 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26172597]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=179010, randomLong=-3710629693586774570, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=10670, randomLong=4223879015442109100, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1236790, data=35, exception=null]
OS Health Check Report - Complete (took 1024 ms)
node3 6.608s 2025-09-30 05:45:07.005 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node3 6.616s 2025-09-30 05:45:07.013 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node3 6.620s 2025-09-30 05:45:07.017 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node0 6.649s 2025-09-30 05:45:07.046 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "iHDorA==", "port": 30124 }, { "ipAddressV4": "CoAAeQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "IjghRQ==", "port": 30125 }, { "ipAddressV4": "CoAAew==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "IqzvOQ==", "port": 30126 }, { "ipAddressV4": "CoAAeA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "iHJx3w==", "port": 30127 }, { "ipAddressV4": "CoAAeg==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "I7wqdA==", "port": 30128 }, { "ipAddressV4": "CoAAfA==", "port": 30128 }] }] }
node0 6.671s 2025-09-30 05:45:07.068 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/0/ConsistencyTestLog.csv
node0 6.672s 2025-09-30 05:45:07.069 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node0 6.687s 2025-09-30 05:45:07.084 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: 2d79d2f0c62ef0c809e7eed726536d76d75b3f353607a4b44cd5ff1dcebcaf7aa8d42249f1a78abf58dc8d256014010a
(root) ConsistencyTestingToolState / quality-pear-demise-train
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate
    1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
    2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
node3 6.702s 2025-09-30 05:45:07.099 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "iHDorA==", "port": 30124 }, { "ipAddressV4": "CoAAeQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "IjghRQ==", "port": 30125 }, { "ipAddressV4": "CoAAew==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "IqzvOQ==", "port": 30126 }, { "ipAddressV4": "CoAAeA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "iHJx3w==", "port": 30127 }, { "ipAddressV4": "CoAAeg==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "I7wqdA==", "port": 30128 }, { "ipAddressV4": "CoAAfA==", "port": 30128 }] }] }
node3 6.724s 2025-09-30 05:45:07.121 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/3/ConsistencyTestLog.csv
node3 6.725s 2025-09-30 05:45:07.122 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node3 6.740s 2025-09-30 05:45:07.137 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: 2d79d2f0c62ef0c809e7eed726536d76d75b3f353607a4b44cd5ff1dcebcaf7aa8d42249f1a78abf58dc8d256014010a
(root) ConsistencyTestingToolState / quality-pear-demise-train
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
node0 6.914s 2025-09-30 05:45:07.311 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node0 6.918s 2025-09-30 05:45:07.315 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node0 6.923s 2025-09-30 05:45:07.320 47 INFO STARTUP <<start-node-0>> ConsistencyTestingToolMain: init called in Main for node 0.
node0 6.924s 2025-09-30 05:45:07.321 48 INFO STARTUP <<start-node-0>> SwirldsPlatform: Starting platform 0
node0 6.925s 2025-09-30 05:45:07.322 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node0 6.928s 2025-09-30 05:45:07.325 50 INFO STARTUP <<start-node-0>> CycleFinder: No cyclical back pressure detected in wiring model.
node0 6.929s 2025-09-30 05:45:07.326 51 INFO STARTUP <<start-node-0>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node0 6.930s 2025-09-30 05:45:07.327 52 INFO STARTUP <<start-node-0>> InputWireChecks: All input wires have been bound.
node0 6.931s 2025-09-30 05:45:07.328 53 WARN STARTUP <<start-node-0>> PcesFileTracker: No preconsensus event files available
node0 6.932s 2025-09-30 05:45:07.329 54 INFO STARTUP <<start-node-0>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node0 6.934s 2025-09-30 05:45:07.331 55 INFO STARTUP <<start-node-0>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node0 6.935s 2025-09-30 05:45:07.332 56 INFO STARTUP <<app: appMain 0>> ConsistencyTestingToolMain: run called in Main.
node0 6.939s 2025-09-30 05:45:07.336 57 INFO PLATFORM_STATUS <platformForkJoinThread-6> StatusStateMachine: Platform spent 196.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node0 6.945s 2025-09-30 05:45:07.342 58 INFO PLATFORM_STATUS <platformForkJoinThread-6> StatusStateMachine: Platform spent 5.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node3 6.948s 2025-09-30 05:45:07.345 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node3 6.953s 2025-09-30 05:45:07.350 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node3 6.958s 2025-09-30 05:45:07.355 47 INFO STARTUP <<start-node-3>> ConsistencyTestingToolMain: init called in Main for node 3.
node3 6.959s 2025-09-30 05:45:07.356 48 INFO STARTUP <<start-node-3>> SwirldsPlatform: Starting platform 3
node3 6.960s 2025-09-30 05:45:07.357 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node3 6.965s 2025-09-30 05:45:07.362 50 INFO STARTUP <<start-node-3>> CycleFinder: No cyclical back pressure detected in wiring model.
node3 6.966s 2025-09-30 05:45:07.363 51 INFO STARTUP <<start-node-3>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node3 6.966s 2025-09-30 05:45:07.363 52 INFO STARTUP <<start-node-3>> InputWireChecks: All input wires have been bound.
node3 6.968s 2025-09-30 05:45:07.365 53 WARN STARTUP <<start-node-3>> PcesFileTracker: No preconsensus event files available
node3 6.968s 2025-09-30 05:45:07.365 54 INFO STARTUP <<start-node-3>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node3 6.970s 2025-09-30 05:45:07.367 55 INFO STARTUP <<start-node-3>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node3 6.971s 2025-09-30 05:45:07.368 56 INFO STARTUP <<app: appMain 3>> ConsistencyTestingToolMain: run called in Main.
node3 6.974s 2025-09-30 05:45:07.371 57 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 176.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node3 6.979s 2025-09-30 05:45:07.376 58 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node4 7.077s 2025-09-30 05:45:07.474 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26282454]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=165530, randomLong=-4153673896778659863, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=14169, randomLong=-4077889052593070599, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1001019, data=35, exception=null]
OS Health Check Report - Complete (took 1024 ms)
node4 7.109s 2025-09-30 05:45:07.506 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node4 7.117s 2025-09-30 05:45:07.514 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node4 7.120s 2025-09-30 05:45:07.517 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node4 7.205s 2025-09-30 05:45:07.602 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "iHDorA==", "port": 30124 }, { "ipAddressV4": "CoAAeQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "IjghRQ==", "port": 30125 }, { "ipAddressV4": "CoAAew==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "IqzvOQ==", "port": 30126 }, { "ipAddressV4": "CoAAeA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "iHJx3w==", "port": 30127 }, { "ipAddressV4": "CoAAeg==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "I7wqdA==", "port": 30128 }, { "ipAddressV4": "CoAAfA==", "port": 30128 }] }] }
node4 7.228s 2025-09-30 05:45:07.625 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 7.229s 2025-09-30 05:45:07.626 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node4 7.244s 2025-09-30 05:45:07.641 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: 2d79d2f0c62ef0c809e7eed726536d76d75b3f353607a4b44cd5ff1dcebcaf7aa8d42249f1a78abf58dc8d256014010a
(root) ConsistencyTestingToolState / quality-pear-demise-train
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
node4 7.445s 2025-09-30 05:45:07.842 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node4 7.450s 2025-09-30 05:45:07.847 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node4 7.455s 2025-09-30 05:45:07.852 47 INFO STARTUP <<start-node-4>> ConsistencyTestingToolMain: init called in Main for node 4.
node4 7.456s 2025-09-30 05:45:07.853 48 INFO STARTUP <<start-node-4>> SwirldsPlatform: Starting platform 4
node4 7.457s 2025-09-30 05:45:07.854 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node4 7.460s 2025-09-30 05:45:07.857 50 INFO STARTUP <<start-node-4>> CycleFinder: No cyclical back pressure detected in wiring model.
node4 7.461s 2025-09-30 05:45:07.858 51 INFO STARTUP <<start-node-4>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node4 7.461s 2025-09-30 05:45:07.858 52 INFO STARTUP <<start-node-4>> InputWireChecks: All input wires have been bound.
node4 7.463s 2025-09-30 05:45:07.860 53 WARN STARTUP <<start-node-4>> PcesFileTracker: No preconsensus event files available
node4 7.464s 2025-09-30 05:45:07.861 54 INFO STARTUP <<start-node-4>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node4 7.465s 2025-09-30 05:45:07.862 55 INFO STARTUP <<start-node-4>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node4 7.466s 2025-09-30 05:45:07.863 56 INFO STARTUP <<app: appMain 4>> ConsistencyTestingToolMain: run called in Main.
node4 7.469s 2025-09-30 05:45:07.866 57 INFO PLATFORM_STATUS <platformForkJoinThread-3> StatusStateMachine: Platform spent 164.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node4 7.475s 2025-09-30 05:45:07.872 58 INFO PLATFORM_STATUS <platformForkJoinThread-2> StatusStateMachine: Platform spent 5.0 ms in REPLAYING_EVENTS. Now in OBSERVING
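The PLATFORM_STATUS lines above trace each node through the startup status machine: STARTING_UP, then REPLAYING_EVENTS, then OBSERVING, with CHECKING and ACTIVE following further down. A small sketch, assuming the column layout shown in this log (node id, elapsed time, date, time, sequence number, level, marker, thread, class, message), that pulls those transitions out of a combined log file; the file name is hypothetical.

    import re
    from collections import defaultdict

    # Matches lines such as:
    # node0 6.939s 2025-09-30 05:45:07.336 57 INFO PLATFORM_STATUS ... Platform spent 196.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
    STATUS_RE = re.compile(
        r"^(node\d+)\s.*?(\d{2}:\d{2}:\d{2}\.\d{3})\s+\d+\s+INFO\s+PLATFORM_STATUS"
        r".*Platform spent .* in (\w+)\. Now in (\w+)"
    )

    def status_transitions(lines):
        """Yield (node, time, previous status, new status) for each status-machine line."""
        for line in lines:
            match = STATUS_RE.match(line)
            if match:
                yield match.groups()

    transitions = defaultdict(list)
    with open("swirlds.log") as log:          # hypothetical combined log file
        for node, time, previous, new in status_transitions(log):
            transitions[node].append((time, previous, new))
    # e.g. transitions["node0"] should end with OBSERVING -> CHECKING -> ACTIVE.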
node1 9.004s 2025-09-30 05:45:09.401 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting1.csv' ]
node1 9.007s 2025-09-30 05:45:09.404 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node2 9.134s 2025-09-30 05:45:09.531 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting2.csv' ]
node2 9.137s 2025-09-30 05:45:09.534 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node0 9.938s 2025-09-30 05:45:10.335 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting0.csv' ]
node0 9.941s 2025-09-30 05:45:10.338 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node3 9.972s 2025-09-30 05:45:10.369 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting3.csv' ]
node3 9.974s 2025-09-30 05:45:10.371 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node4 10.468s 2025-09-30 05:45:10.865 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting4.csv' ]
node4 10.471s 2025-09-30 05:45:10.868 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node1 16.098s 2025-09-30 05:45:16.495 61 INFO PLATFORM_STATUS <platformForkJoinThread-5> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node2 16.226s 2025-09-30 05:45:16.623 61 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node0 17.031s 2025-09-30 05:45:17.428 61 INFO PLATFORM_STATUS <platformForkJoinThread-7> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node3 17.068s 2025-09-30 05:45:17.465 61 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node4 17.563s 2025-09-30 05:45:17.960 61 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node3 18.283s 2025-09-30 05:45:18.680 62 INFO STARTUP <<scheduler TransactionHandler>> DefaultTransactionHandler: Ignoring empty consensus round 1
node2 18.358s 2025-09-30 05:45:18.755 62 INFO STARTUP <<scheduler TransactionHandler>> DefaultTransactionHandler: Ignoring empty consensus round 1
node1 18.362s 2025-09-30 05:45:18.759 62 INFO STARTUP <<scheduler TransactionHandler>> DefaultTransactionHandler: Ignoring empty consensus round 1
node4 18.388s 2025-09-30 05:45:18.785 62 INFO STARTUP <<scheduler TransactionHandler>> DefaultTransactionHandler: Ignoring empty consensus round 1
node0 18.416s 2025-09-30 05:45:18.813 62 INFO STARTUP <<scheduler TransactionHandler>> DefaultTransactionHandler: Ignoring empty consensus round 1
node0 18.735s 2025-09-30 05:45:19.132 63 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 1.7 s in CHECKING. Now in ACTIVE
node0 18.737s 2025-09-30 05:45:19.134 65 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 2 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node3 18.763s 2025-09-30 05:45:19.160 64 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 2 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node2 18.776s 2025-09-30 05:45:19.173 63 INFO PLATFORM_STATUS <platformForkJoinThread-6> StatusStateMachine: Platform spent 2.5 s in CHECKING. Now in ACTIVE
node2 18.779s 2025-09-30 05:45:19.176 65 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 2 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node1 18.780s 2025-09-30 05:45:19.177 63 INFO PLATFORM_STATUS <platformForkJoinThread-6> StatusStateMachine: Platform spent 2.7 s in CHECKING. Now in ACTIVE
node1 18.782s 2025-09-30 05:45:19.179 65 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 2 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node4 18.817s 2025-09-30 05:45:19.214 64 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 2 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node3 18.935s 2025-09-30 05:45:19.332 80 INFO PLATFORM_STATUS <platformForkJoinThread-3> StatusStateMachine: Platform spent 1.9 s in CHECKING. Now in ACTIVE
node3 18.938s 2025-09-30 05:45:19.335 82 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 2 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/2
node3 18.940s 2025-09-30 05:45:19.337 83 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node4 18.973s 2025-09-30 05:45:19.370 81 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 2 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/2
node1 18.974s 2025-09-30 05:45:19.371 82 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 2 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/2
node4 18.975s 2025-09-30 05:45:19.372 82 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node1 18.976s 2025-09-30 05:45:19.373 83 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node2 19.045s 2025-09-30 05:45:19.442 82 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 2 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/2
node2 19.047s 2025-09-30 05:45:19.444 83 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node0 19.151s 2025-09-30 05:45:19.548 84 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 2 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/2
node0 19.153s 2025-09-30 05:45:19.550 85 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node3 19.215s 2025-09-30 05:45:19.612 118 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node1 19.217s 2025-09-30 05:45:19.614 118 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node3 19.218s 2025-09-30 05:45:19.615 119 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 2
Timestamp: 2025-09-30T05:45:17.785027475Z
Next consensus number: 7
Legacy running event hash: d767c021dc7eb143a9f8c055129ec1bbb16ecaabd56da3d058db786ff264cb9f6f766d3edbd61a9fdbeb1cebb6db27b6
Legacy running event mnemonic: basket-winner-icon-stumble
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -799502542
Root hash: bc917e9aaa46b48ba33d49c1fd56dc76f113e74a9227bd97148344ac12464780cef8d7e9e46b427277d76989f606a3c4
(root) ConsistencyTestingToolState / hospital-spy-resemble-huge
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 expire-venue-document-clay
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
  3 StringLeaf 8898299034380133366 /3 nephew-deny-real-blanket
  4 StringLeaf 1 /4 wreck-whale-old-bottom
node1 19.219s 2025-09-30 05:45:19.616 119 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 2
Timestamp: 2025-09-30T05:45:17.785027475Z
Next consensus number: 7
Legacy running event hash: d767c021dc7eb143a9f8c055129ec1bbb16ecaabd56da3d058db786ff264cb9f6f766d3edbd61a9fdbeb1cebb6db27b6
Legacy running event mnemonic: basket-winner-icon-stumble
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -799502542
Root hash: bc917e9aaa46b48ba33d49c1fd56dc76f113e74a9227bd97148344ac12464780cef8d7e9e46b427277d76989f606a3c4
(root) ConsistencyTestingToolState / hospital-spy-resemble-huge
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 expire-venue-document-clay
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
  3 StringLeaf 8898299034380133366 /3 nephew-deny-real-blanket
  4 StringLeaf 1 /4 wreck-whale-old-bottom
node4 19.245s 2025-09-30 05:45:19.642 117 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node4 19.248s 2025-09-30 05:45:19.645 118 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 2
Timestamp: 2025-09-30T05:45:17.785027475Z
Next consensus number: 7
Legacy running event hash: d767c021dc7eb143a9f8c055129ec1bbb16ecaabd56da3d058db786ff264cb9f6f766d3edbd61a9fdbeb1cebb6db27b6
Legacy running event mnemonic: basket-winner-icon-stumble
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -799502542
Root hash: bc917e9aaa46b48ba33d49c1fd56dc76f113e74a9227bd97148344ac12464780cef8d7e9e46b427277d76989f606a3c4
(root) ConsistencyTestingToolState / hospital-spy-resemble-huge
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 expire-venue-document-clay
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
  3 StringLeaf 8898299034380133366 /3 nephew-deny-real-blanket
  4 StringLeaf 1 /4 wreck-whale-old-bottom
node1 19.252s 2025-09-30 05:45:19.649 120 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/30/2025-09-30T05+45+16.529211986Z_seq0_minr1_maxr501_orgn0.pces
node1 19.253s 2025-09-30 05:45:19.650 121 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/1/2025/09/30/2025-09-30T05+45+16.529211986Z_seq0_minr1_maxr501_orgn0.pces
node1 19.253s 2025-09-30 05:45:19.650 122 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 19.254s 2025-09-30 05:45:19.651 123 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 19.256s 2025-09-30 05:45:19.653 120 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/30/2025-09-30T05+45+16.698864552Z_seq0_minr1_maxr501_orgn0.pces
node3 19.256s 2025-09-30 05:45:19.653 121 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/3/2025/09/30/2025-09-30T05+45+16.698864552Z_seq0_minr1_maxr501_orgn0.pces
node3 19.257s 2025-09-30 05:45:19.654 122 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 19.258s 2025-09-30 05:45:19.655 123 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 19.260s 2025-09-30 05:45:19.657 124 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 2 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/2 {"round":2,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/2/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 19.265s 2025-09-30 05:45:19.662 124 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 2 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/2 {"round":2,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/2/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 19.289s 2025-09-30 05:45:19.686 119 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/09/30/2025-09-30T05+45+16.781749331Z_seq0_minr1_maxr501_orgn0.pces
node4 19.290s 2025-09-30 05:45:19.687 120 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/4/2025/09/30/2025-09-30T05+45+16.781749331Z_seq0_minr1_maxr501_orgn0.pces
node4 19.290s 2025-09-30 05:45:19.687 121 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 19.291s 2025-09-30 05:45:19.688 122 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 19.297s 2025-09-30 05:45:19.694 123 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 2 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/2 {"round":2,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/2/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 19.348s 2025-09-30 05:45:19.745 118 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node2 19.351s 2025-09-30 05:45:19.748 119 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 2
Timestamp: 2025-09-30T05:45:17.785027475Z
Next consensus number: 7
Legacy running event hash: d767c021dc7eb143a9f8c055129ec1bbb16ecaabd56da3d058db786ff264cb9f6f766d3edbd61a9fdbeb1cebb6db27b6
Legacy running event mnemonic: basket-winner-icon-stumble
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -799502542
Root hash: bc917e9aaa46b48ba33d49c1fd56dc76f113e74a9227bd97148344ac12464780cef8d7e9e46b427277d76989f606a3c4
(root) ConsistencyTestingToolState / hospital-spy-resemble-huge
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 expire-venue-document-clay
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
  3 StringLeaf 8898299034380133366 /3 nephew-deny-real-blanket
  4 StringLeaf 1 /4 wreck-whale-old-bottom
node2 19.387s 2025-09-30 05:45:19.784 120 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/30/2025-09-30T05+45+16.662580277Z_seq0_minr1_maxr501_orgn0.pces
node2 19.387s 2025-09-30 05:45:19.784 121 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/2/2025/09/30/2025-09-30T05+45+16.662580277Z_seq0_minr1_maxr501_orgn0.pces
node2 19.387s 2025-09-30 05:45:19.784 122 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 19.389s 2025-09-30 05:45:19.786 123 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 19.395s 2025-09-30 05:45:19.792 124 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 2 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/2 {"round":2,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/2/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 19.399s 2025-09-30 05:45:19.796 120 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node0 19.402s 2025-09-30 05:45:19.799 121 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 2
Timestamp: 2025-09-30T05:45:17.785027475Z
Next consensus number: 7
Legacy running event hash: d767c021dc7eb143a9f8c055129ec1bbb16ecaabd56da3d058db786ff264cb9f6f766d3edbd61a9fdbeb1cebb6db27b6
Legacy running event mnemonic: basket-winner-icon-stumble
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -799502542
Root hash: bc917e9aaa46b48ba33d49c1fd56dc76f113e74a9227bd97148344ac12464780cef8d7e9e46b427277d76989f606a3c4
(root) ConsistencyTestingToolState / hospital-spy-resemble-huge
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 expire-venue-document-clay
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
  3 StringLeaf 8898299034380133366 /3 nephew-deny-real-blanket
  4 StringLeaf 1 /4 wreck-whale-old-bottom
node0 19.436s 2025-09-30 05:45:19.833 122 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/30/2025-09-30T05+45+16.695999103Z_seq0_minr1_maxr501_orgn0.pces
node0 19.437s 2025-09-30 05:45:19.834 123 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/0/2025/09/30/2025-09-30T05+45+16.695999103Z_seq0_minr1_maxr501_orgn0.pces
node0 19.437s 2025-09-30 05:45:19.834 124 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 19.438s 2025-09-30 05:45:19.835 125 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 19.444s 2025-09-30 05:45:19.841 126 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 2 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/2 {"round":2,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/2/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
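The BestEffortPcesFileCopy entries above copy preconsensus event stream (.pces) files whose names pack several fields together. Reading 'seq', 'minr', 'maxr' and 'orgn' as the file's sequence number, lower and upper round bounds, and origin round is an assumption based on the abbreviations; the sketch below splits a name from this log into those parts.

    import re

    PCES_NAME_RE = re.compile(
        r"^(?P<created>.+Z)_seq(?P<seq>\d+)_minr(?P<minr>\d+)"
        r"_maxr(?P<maxr>\d+)_orgn(?P<orgn>\d+)\.pces$"
    )

    def parse_pces_name(name: str) -> dict:
        """Split a .pces file name into its creation time and numeric fields."""
        match = PCES_NAME_RE.match(name)
        if match is None:
            raise ValueError(f"unrecognised .pces name: {name}")
        parts = match.groupdict()
        return {key: value if key == "created" else int(value) for key, value in parts.items()}

    # Example using a file name copied in the log above:
    parse_pces_name("2025-09-30T05+45+16.529211986Z_seq0_minr1_maxr501_orgn0.pces")
    # -> {'created': '2025-09-30T05+45+16.529211986Z', 'seq': 0, 'minr': 1, 'maxr': 501, 'orgn': 0}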
node4 20.187s 2025-09-30 05:45:20.584 127 INFO PLATFORM_STATUS <platformForkJoinThread-7> StatusStateMachine: Platform spent 2.6 s in CHECKING. Now in ACTIVE
node2 1m 1.223s 2025-09-30 05:46:01.620 1085 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 92 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 1m 1.235s 2025-09-30 05:46:01.632 1079 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 92 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 1m 1.280s 2025-09-30 05:46:01.677 1089 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 92 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 1m 1.293s 2025-09-30 05:46:01.690 1081 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 92 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 1m 1.305s 2025-09-30 05:46:01.702 1083 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 92 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 1m 1.416s 2025-09-30 05:46:01.813 1092 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 92 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/92
node0 1m 1.417s 2025-09-30 05:46:01.814 1093 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 92
node1 1m 1.486s 2025-09-30 05:46:01.883 1086 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 92 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/92
node1 1m 1.486s 2025-09-30 05:46:01.883 1087 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 92
node3 1m 1.486s 2025-09-30 05:46:01.883 1082 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 92 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/92
node3 1m 1.487s 2025-09-30 05:46:01.884 1083 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 92
node0 1m 1.505s 2025-09-30 05:46:01.902 1126 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 92
node0 1m 1.509s 2025-09-30 05:46:01.906 1127 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 92
Timestamp: 2025-09-30T05:46:00.494265709Z
Next consensus number: 3430
Legacy running event hash: a92c7fac668af60a323bd53666eaf02fdf26e0c2cabd6ef74ff6cf2926ea8f872cac98a2f0baa0aa6bae2b89e5504d25
Legacy running event mnemonic: surround-void-spend-disagree
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1716884644
Root hash: fa958bafef70f401ee9c5ccbfd29f2c77d4e3505604457dbfa4e80f0b6e63f29afaa26446319bfeb474a8717ec58c762
(root) ConsistencyTestingToolState / pulp-mechanic-bamboo-lend
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 network-length-soon-mad
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
  3 StringLeaf -4385783367019147955 /3 that-olympic-wish-essay
  4 StringLeaf 91 /4 slight-collect-lava-dog
node0 1m 1.518s 2025-09-30 05:46:01.915 1128 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/30/2025-09-30T05+45+16.695999103Z_seq0_minr1_maxr501_orgn0.pces
node0 1m 1.518s 2025-09-30 05:46:01.915 1129 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 64 File: data/saved/preconsensus-events/0/2025/09/30/2025-09-30T05+45+16.695999103Z_seq0_minr1_maxr501_orgn0.pces
node0 1m 1.518s 2025-09-30 05:46:01.915 1130 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 1m 1.521s 2025-09-30 05:46:01.918 1131 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 1m 1.522s 2025-09-30 05:46:01.919 1132 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 92 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/92 {"round":92,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/92/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 1m 1.532s 2025-09-30 05:46:01.929 1084 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 92 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/92
node4 1m 1.533s 2025-09-30 05:46:01.930 1085 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 92
node2 1m 1.535s 2025-09-30 05:46:01.932 1088 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 92 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/92
node2 1m 1.536s 2025-09-30 05:46:01.933 1089 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 92
node1 1m 1.570s 2025-09-30 05:46:01.967 1120 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 92
node3 1m 1.570s 2025-09-30 05:46:01.967 1124 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 92
node1 1m 1.573s 2025-09-30 05:46:01.970 1121 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 92
Timestamp: 2025-09-30T05:46:00.494265709Z
Next consensus number: 3430
Legacy running event hash: a92c7fac668af60a323bd53666eaf02fdf26e0c2cabd6ef74ff6cf2926ea8f872cac98a2f0baa0aa6bae2b89e5504d25
Legacy running event mnemonic: surround-void-spend-disagree
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1716884644
Root hash: fa958bafef70f401ee9c5ccbfd29f2c77d4e3505604457dbfa4e80f0b6e63f29afaa26446319bfeb474a8717ec58c762
(root) ConsistencyTestingToolState / pulp-mechanic-bamboo-lend
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 network-length-soon-mad
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
  3 StringLeaf -4385783367019147955 /3 that-olympic-wish-essay
  4 StringLeaf 91 /4 slight-collect-lava-dog
node3 1m 1.573s 2025-09-30 05:46:01.970 1125 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 92
Timestamp: 2025-09-30T05:46:00.494265709Z
Next consensus number: 3430
Legacy running event hash: a92c7fac668af60a323bd53666eaf02fdf26e0c2cabd6ef74ff6cf2926ea8f872cac98a2f0baa0aa6bae2b89e5504d25
Legacy running event mnemonic: surround-void-spend-disagree
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1716884644
Root hash: fa958bafef70f401ee9c5ccbfd29f2c77d4e3505604457dbfa4e80f0b6e63f29afaa26446319bfeb474a8717ec58c762
(root) ConsistencyTestingToolState / pulp-mechanic-bamboo-lend
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 network-length-soon-mad
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
  3 StringLeaf -4385783367019147955 /3 that-olympic-wish-essay
  4 StringLeaf 91 /4 slight-collect-lava-dog
node1 1m 1.581s 2025-09-30 05:46:01.978 1122 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/30/2025-09-30T05+45+16.529211986Z_seq0_minr1_maxr501_orgn0.pces
node1 1m 1.581s 2025-09-30 05:46:01.978 1123 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 64 File: data/saved/preconsensus-events/1/2025/09/30/2025-09-30T05+45+16.529211986Z_seq0_minr1_maxr501_orgn0.pces
node1 1m 1.581s 2025-09-30 05:46:01.978 1124 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 1m 1.582s 2025-09-30 05:46:01.979 1126 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/30/2025-09-30T05+45+16.698864552Z_seq0_minr1_maxr501_orgn0.pces
node3 1m 1.582s 2025-09-30 05:46:01.979 1127 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 64 File: data/saved/preconsensus-events/3/2025/09/30/2025-09-30T05+45+16.698864552Z_seq0_minr1_maxr501_orgn0.pces
node3 1m 1.582s 2025-09-30 05:46:01.979 1128 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 1m 1.584s 2025-09-30 05:46:01.981 1125 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 1m 1.584s 2025-09-30 05:46:01.981 1126 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 92 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/92 {"round":92,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/92/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 1m 1.585s 2025-09-30 05:46:01.982 1129 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 1m 1.586s 2025-09-30 05:46:01.983 1130 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 92 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/92 {"round":92,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/92/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 1m 1.619s 2025-09-30 05:46:02.016 1118 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 92
node4 1m 1.622s 2025-09-30 05:46:02.019 1119 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 92
Timestamp: 2025-09-30T05:46:00.494265709Z
Next consensus number: 3430
Legacy running event hash: a92c7fac668af60a323bd53666eaf02fdf26e0c2cabd6ef74ff6cf2926ea8f872cac98a2f0baa0aa6bae2b89e5504d25
Legacy running event mnemonic: surround-void-spend-disagree
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1716884644
Root hash: fa958bafef70f401ee9c5ccbfd29f2c77d4e3505604457dbfa4e80f0b6e63f29afaa26446319bfeb474a8717ec58c762
(root) ConsistencyTestingToolState / pulp-mechanic-bamboo-lend
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 network-length-soon-mad
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
  3 StringLeaf -4385783367019147955 /3 that-olympic-wish-essay
  4 StringLeaf 91 /4 slight-collect-lava-dog
node2 1m 1.626s 2025-09-30 05:46:02.023 1122 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 92
node2 1m 1.629s 2025-09-30 05:46:02.026 1123 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 92
Timestamp: 2025-09-30T05:46:00.494265709Z
Next consensus number: 3430
Legacy running event hash: a92c7fac668af60a323bd53666eaf02fdf26e0c2cabd6ef74ff6cf2926ea8f872cac98a2f0baa0aa6bae2b89e5504d25
Legacy running event mnemonic: surround-void-spend-disagree
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1716884644
Root hash: fa958bafef70f401ee9c5ccbfd29f2c77d4e3505604457dbfa4e80f0b6e63f29afaa26446319bfeb474a8717ec58c762
(root) ConsistencyTestingToolState / pulp-mechanic-bamboo-lend
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 network-length-soon-mad
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
  3 StringLeaf -4385783367019147955 /3 that-olympic-wish-essay
  4 StringLeaf 91 /4 slight-collect-lava-dog
node4 1m 1.633s 2025-09-30 05:46:02.030 1120 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/09/30/2025-09-30T05+45+16.781749331Z_seq0_minr1_maxr501_orgn0.pces
node4 1m 1.634s 2025-09-30 05:46:02.031 1121 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 64 File: data/saved/preconsensus-events/4/2025/09/30/2025-09-30T05+45+16.781749331Z_seq0_minr1_maxr501_orgn0.pces
node4 1m 1.634s 2025-09-30 05:46:02.031 1122 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 1m 1.637s 2025-09-30 05:46:02.034 1123 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 1m 1.637s 2025-09-30 05:46:02.034 1124 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 92 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/92 {"round":92,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/92/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 1m 1.640s 2025-09-30 05:46:02.037 1124 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/30/2025-09-30T05+45+16.662580277Z_seq0_minr1_maxr501_orgn0.pces
node2 1m 1.641s 2025-09-30 05:46:02.038 1125 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 64 File: data/saved/preconsensus-events/2/2025/09/30/2025-09-30T05+45+16.662580277Z_seq0_minr1_maxr501_orgn0.pces
node2 1m 1.641s 2025-09-30 05:46:02.038 1126 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 1m 1.643s 2025-09-30 05:46:02.040 1127 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 1m 1.644s 2025-09-30 05:46:02.041 1128 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 92 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/92 {"round":92,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/92/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
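Each "Finished writing state ... to disk" line above ends with a machine-readable StateSavedToDiskPayload JSON object (round, freezeState, reason, directory). A minimal sketch of pulling those payloads out of a combined export like this one follows; the swirlds.log file name and the assumption that each payload sits on a single line are illustrative, not part of the platform's tooling.

    import json
    import re

    PAYLOAD = re.compile(
        r"(\{.*\}) \[com\.swirlds\.logging\.legacy\.payload\.StateSavedToDiskPayload\]"
    )

    def saved_state_payloads(path="swirlds.log"):
        """Yield (node id, payload dict) for every state-saved payload in the log."""
        with open(path, encoding="utf-8") as log:
            for line in log:
                match = PAYLOAD.search(line)
                if match:
                    # the node id is the first column of this combined view
                    yield line.split()[0], json.loads(match.group(1))

    # Example: which rounds did each node persist, and why?
    # for node, payload in saved_state_payloads():
    #     print(node, payload["round"], payload["reason"])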
node0 2.013m 2025-09-30 05:47:01.185 2470 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 213 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 2.014m 2025-09-30 05:47:01.221 2454 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 213 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 2.015m 2025-09-30 05:47:01.290 2472 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 213 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 2.016m 2025-09-30 05:47:01.337 2466 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 213 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 2.016m 2025-09-30 05:47:01.344 2434 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 213 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 2m 1.013s 2025-09-30 05:47:01.410 2437 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 213 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/213
node3 2m 1.014s 2025-09-30 05:47:01.411 2438 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 213
node2 2m 1.085s 2025-09-30 05:47:01.482 2469 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 213 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/213
node2 2m 1.086s 2025-09-30 05:47:01.483 2470 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 213
node3 2m 1.109s 2025-09-30 05:47:01.506 2469 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 213
node3 2m 1.112s 2025-09-30 05:47:01.509 2470 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 213
Timestamp: 2025-09-30T05:47:00.259714504Z
Next consensus number: 8261
Legacy running event hash: bf7686ad6b322a2a4174e9c662b948efaf0b314c82f6e305c838d53bd3febf93eb392a2e3c3c76c722ab849fc852b3bd
Legacy running event mnemonic: bacon-clerk-green-earn
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1216923955
Root hash: 490d58644cd90a516012e3559fbe8bd4b108c0122c612c3003c86155b0a92bbdd60ec7977e1d858b77e71fd081c4b80d
(root) ConsistencyTestingToolState / canvas-arrive-open-enforce
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 carpet-ask-imitate-equal
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
  3 StringLeaf -5776681336046445903 /3 bag-symbol-fluid-jar
  4 StringLeaf 212 /4 supreme-absent-subway-fork
node3 2m 1.120s 2025-09-30 05:47:01.517 2471 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/30/2025-09-30T05+45+16.698864552Z_seq0_minr1_maxr501_orgn0.pces
node3 2m 1.120s 2025-09-30 05:47:01.517 2472 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 184 File: data/saved/preconsensus-events/3/2025/09/30/2025-09-30T05+45+16.698864552Z_seq0_minr1_maxr501_orgn0.pces
node3 2m 1.120s 2025-09-30 05:47:01.517 2473 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 2m 1.126s 2025-09-30 05:47:01.523 2474 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 2m 1.127s 2025-09-30 05:47:01.524 2475 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 213 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/213 {"round":213,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/213/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 2m 1.156s 2025-09-30 05:47:01.553 2457 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 213 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/213
node1 2m 1.156s 2025-09-30 05:47:01.553 2458 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 213
node0 2m 1.163s 2025-09-30 05:47:01.560 2473 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 213 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/213
node0 2m 1.164s 2025-09-30 05:47:01.561 2474 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 213
node2 2m 1.177s 2025-09-30 05:47:01.574 2505 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 213
node2 2m 1.179s 2025-09-30 05:47:01.576 2506 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 213
Timestamp: 2025-09-30T05:47:00.259714504Z
Next consensus number: 8261
Legacy running event hash: bf7686ad6b322a2a4174e9c662b948efaf0b314c82f6e305c838d53bd3febf93eb392a2e3c3c76c722ab849fc852b3bd
Legacy running event mnemonic: bacon-clerk-green-earn
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1216923955
Root hash: 490d58644cd90a516012e3559fbe8bd4b108c0122c612c3003c86155b0a92bbdd60ec7977e1d858b77e71fd081c4b80d
(root) ConsistencyTestingToolState / canvas-arrive-open-enforce
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 carpet-ask-imitate-equal
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
  3 StringLeaf -5776681336046445903 /3 bag-symbol-fluid-jar
  4 StringLeaf 212 /4 supreme-absent-subway-fork
node4 2m 1.183s 2025-09-30 05:47:01.580 2475 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 213 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/213
node4 2m 1.184s 2025-09-30 05:47:01.581 2476 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 213
node2 2m 1.187s 2025-09-30 05:47:01.584 2507 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/30/2025-09-30T05+45+16.662580277Z_seq0_minr1_maxr501_orgn0.pces
node2 2m 1.187s 2025-09-30 05:47:01.584 2508 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 184 File: data/saved/preconsensus-events/2/2025/09/30/2025-09-30T05+45+16.662580277Z_seq0_minr1_maxr501_orgn0.pces
node2 2m 1.187s 2025-09-30 05:47:01.584 2509 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 2m 1.193s 2025-09-30 05:47:01.590 2510 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 2m 1.194s 2025-09-30 05:47:01.591 2511 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 213 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/213 {"round":213,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/213/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 2m 1.248s 2025-09-30 05:47:01.645 2493 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 213
node1 2m 1.250s 2025-09-30 05:47:01.647 2494 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 213
Timestamp: 2025-09-30T05:47:00.259714504Z
Next consensus number: 8261
Legacy running event hash: bf7686ad6b322a2a4174e9c662b948efaf0b314c82f6e305c838d53bd3febf93eb392a2e3c3c76c722ab849fc852b3bd
Legacy running event mnemonic: bacon-clerk-green-earn
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1216923955
Root hash: 490d58644cd90a516012e3559fbe8bd4b108c0122c612c3003c86155b0a92bbdd60ec7977e1d858b77e71fd081c4b80d
(root) ConsistencyTestingToolState / canvas-arrive-open-enforce
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 carpet-ask-imitate-equal
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
  3 StringLeaf -5776681336046445903 /3 bag-symbol-fluid-jar
  4 StringLeaf 212 /4 supreme-absent-subway-fork
node1 2m 1.256s 2025-09-30 05:47:01.653 2495 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/30/2025-09-30T05+45+16.529211986Z_seq0_minr1_maxr501_orgn0.pces
node1 2m 1.257s 2025-09-30 05:47:01.654 2496 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 184 File: data/saved/preconsensus-events/1/2025/09/30/2025-09-30T05+45+16.529211986Z_seq0_minr1_maxr501_orgn0.pces
node1 2m 1.257s 2025-09-30 05:47:01.654 2497 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 2m 1.263s 2025-09-30 05:47:01.660 2498 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 2m 1.263s 2025-09-30 05:47:01.660 2499 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 213 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/213 {"round":213,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/213/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 2m 1.264s 2025-09-30 05:47:01.661 2513 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 213
node0 2m 1.266s 2025-09-30 05:47:01.663 2514 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 213
Timestamp: 2025-09-30T05:47:00.259714504Z
Next consensus number: 8261
Legacy running event hash: bf7686ad6b322a2a4174e9c662b948efaf0b314c82f6e305c838d53bd3febf93eb392a2e3c3c76c722ab849fc852b3bd
Legacy running event mnemonic: bacon-clerk-green-earn
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1216923955
Root hash: 490d58644cd90a516012e3559fbe8bd4b108c0122c612c3003c86155b0a92bbdd60ec7977e1d858b77e71fd081c4b80d
(root) ConsistencyTestingToolState / canvas-arrive-open-enforce
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 carpet-ask-imitate-equal
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
  3 StringLeaf -5776681336046445903 /3 bag-symbol-fluid-jar
  4 StringLeaf 212 /4 supreme-absent-subway-fork
node0 2m 1.275s 2025-09-30 05:47:01.672 2515 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/30/2025-09-30T05+45+16.695999103Z_seq0_minr1_maxr501_orgn0.pces
node0 2m 1.276s 2025-09-30 05:47:01.673 2516 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 184 File: data/saved/preconsensus-events/0/2025/09/30/2025-09-30T05+45+16.695999103Z_seq0_minr1_maxr501_orgn0.pces
node0 2m 1.276s 2025-09-30 05:47:01.673 2517 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 2m 1.279s 2025-09-30 05:47:01.676 2511 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 213
node0 2m 1.282s 2025-09-30 05:47:01.679 2518 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 2m 1.282s 2025-09-30 05:47:01.679 2519 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 213 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/213 {"round":213,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/213/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 2m 1.282s 2025-09-30 05:47:01.679 2512 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 213
Timestamp: 2025-09-30T05:47:00.259714504Z
Next consensus number: 8261
Legacy running event hash: bf7686ad6b322a2a4174e9c662b948efaf0b314c82f6e305c838d53bd3febf93eb392a2e3c3c76c722ab849fc852b3bd
Legacy running event mnemonic: bacon-clerk-green-earn
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1216923955
Root hash: 490d58644cd90a516012e3559fbe8bd4b108c0122c612c3003c86155b0a92bbdd60ec7977e1d858b77e71fd081c4b80d
(root) ConsistencyTestingToolState / canvas-arrive-open-enforce
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 carpet-ask-imitate-equal
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
  3 StringLeaf -5776681336046445903 /3 bag-symbol-fluid-jar
  4 StringLeaf 212 /4 supreme-absent-subway-fork
node4 2m 1.289s 2025-09-30 05:47:01.686 2513 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/09/30/2025-09-30T05+45+16.781749331Z_seq0_minr1_maxr501_orgn0.pces
node4 2m 1.290s 2025-09-30 05:47:01.687 2514 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 184 File: data/saved/preconsensus-events/4/2025/09/30/2025-09-30T05+45+16.781749331Z_seq0_minr1_maxr501_orgn0.pces
node4 2m 1.290s 2025-09-30 05:47:01.687 2515 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 2m 1.296s 2025-09-30 05:47:01.693 2516 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 2m 1.296s 2025-09-30 05:47:01.693 2517 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 213 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/213 {"round":213,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/213/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
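Every node that persists round 213 reports the same Root hash in its "Information for state written to disk" block, which is exactly the property the consistency testing tool exercises. A rough cross-node check follows, assuming the state-info block is laid out one field per line as shown above and that the combined export lives at swirlds.log (both are assumptions of this sketch).

    from collections import defaultdict

    def root_hashes_by_round(path="swirlds.log"):
        hashes = defaultdict(dict)            # round number -> {node id: root hash}
        node = None
        round_no = None
        with open(path, encoding="utf-8") as log:
            for line in log:
                if "Information for state written to disk:" in line:
                    node = line.split()[0]    # node prefix on the header line
                elif line.startswith("Round:"):
                    round_no = int(line.split()[1])
                elif line.startswith("Root hash:") and node is not None:
                    hashes[round_no][node] = line.split()[2]
        return hashes

    # Example: flag any round where the nodes disagree on the root hash.
    # for rnd, per_node in root_hashes_by_round().items():
    #     if len(set(per_node.values())) > 1:
    #         print("root hash mismatch in round", rnd, per_node)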
node0 3m 1.228s 2025-09-30 05:48:01.625 3915 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 340 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 3m 1.233s 2025-09-30 05:48:01.630 3861 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 340 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 3m 1.271s 2025-09-30 05:48:01.668 3877 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 340 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 3m 1.272s 2025-09-30 05:48:01.669 3883 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 340 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 3m 1.360s 2025-09-30 05:48:01.757 3891 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 340 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 3m 1.403s 2025-09-30 05:48:01.800 3887 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 340 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/340
node1 3m 1.404s 2025-09-30 05:48:01.801 3890 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 340
node2 3m 1.462s 2025-09-30 05:48:01.859 3883 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 340 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/340
node2 3m 1.462s 2025-09-30 05:48:01.859 3884 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 340
node0 3m 1.476s 2025-09-30 05:48:01.873 3918 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 340 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/340
node0 3m 1.476s 2025-09-30 05:48:01.873 3920 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 340
node1 3m 1.491s 2025-09-30 05:48:01.888 3926 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 340
node1 3m 1.493s 2025-09-30 05:48:01.890 3927 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 340
Timestamp: 2025-09-30T05:48:00.274742119Z
Next consensus number: 13065
Legacy running event hash: beeb878368767d76e5fb8a2aeca4e52fdfff423011fc5e114548541fc50c1469ff349a153c6305a3461e0a23ba681ac7
Legacy running event mnemonic: cargo-autumn-attract-put
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -799423150
Root hash: 5075ae11590773224e41d4f1ca87efdd8393327eacc12df4f15f2f96aab469e475d6793ee64bd11110515581a13f9495
(root) ConsistencyTestingToolState / attitude-digital-can-gold
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 reject-shiver-donkey-engage
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
  3 StringLeaf 2285048525102338985 /3 grass-flock-like-wage
  4 StringLeaf 339 /4 very-bulb-swallow-exotic
node1 3m 1.499s 2025-09-30 05:48:01.896 3928 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/30/2025-09-30T05+45+16.529211986Z_seq0_minr1_maxr501_orgn0.pces
node1 3m 1.500s 2025-09-30 05:48:01.897 3929 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 312 File: data/saved/preconsensus-events/1/2025/09/30/2025-09-30T05+45+16.529211986Z_seq0_minr1_maxr501_orgn0.pces
node1 3m 1.500s 2025-09-30 05:48:01.897 3930 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 3m 1.500s 2025-09-30 05:48:01.897 3895 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 340 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/340
node4 3m 1.501s 2025-09-30 05:48:01.898 3898 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 340
node1 3m 1.509s 2025-09-30 05:48:01.906 3931 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 3m 1.509s 2025-09-30 05:48:01.906 3932 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 340 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/340 {"round":340,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/340/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 3m 1.533s 2025-09-30 05:48:01.930 3865 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 340 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/340
node3 3m 1.534s 2025-09-30 05:48:01.931 3868 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 340
node2 3m 1.550s 2025-09-30 05:48:01.947 3924 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 340
node2 3m 1.552s 2025-09-30 05:48:01.949 3925 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 340
Timestamp: 2025-09-30T05:48:00.274742119Z
Next consensus number: 13065
Legacy running event hash: beeb878368767d76e5fb8a2aeca4e52fdfff423011fc5e114548541fc50c1469ff349a153c6305a3461e0a23ba681ac7
Legacy running event mnemonic: cargo-autumn-attract-put
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -799423150
Root hash: 5075ae11590773224e41d4f1ca87efdd8393327eacc12df4f15f2f96aab469e475d6793ee64bd11110515581a13f9495
(root) ConsistencyTestingToolState / attitude-digital-can-gold
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 reject-shiver-donkey-engage
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
  3 StringLeaf 2285048525102338985 /3 grass-flock-like-wage
  4 StringLeaf 339 /4 very-bulb-swallow-exotic
node2 3m 1.560s 2025-09-30 05:48:01.957 3926 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/30/2025-09-30T05+45+16.662580277Z_seq0_minr1_maxr501_orgn0.pces
node2 3m 1.560s 2025-09-30 05:48:01.957 3927 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 312 File: data/saved/preconsensus-events/2/2025/09/30/2025-09-30T05+45+16.662580277Z_seq0_minr1_maxr501_orgn0.pces
node2 3m 1.560s 2025-09-30 05:48:01.957 3928 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 3m 1.569s 2025-09-30 05:48:01.966 3929 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 3m 1.570s 2025-09-30 05:48:01.967 3962 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 340
node2 3m 1.570s 2025-09-30 05:48:01.967 3930 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 340 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/340 {"round":340,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/340/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 3m 1.573s 2025-09-30 05:48:01.970 3963 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 340
Timestamp: 2025-09-30T05:48:00.274742119Z
Next consensus number: 13065
Legacy running event hash: beeb878368767d76e5fb8a2aeca4e52fdfff423011fc5e114548541fc50c1469ff349a153c6305a3461e0a23ba681ac7
Legacy running event mnemonic: cargo-autumn-attract-put
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -799423150
Root hash: 5075ae11590773224e41d4f1ca87efdd8393327eacc12df4f15f2f96aab469e475d6793ee64bd11110515581a13f9495
(root) ConsistencyTestingToolState / attitude-digital-can-gold
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 reject-shiver-donkey-engage
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
  3 StringLeaf 2285048525102338985 /3 grass-flock-like-wage
  4 StringLeaf 339 /4 very-bulb-swallow-exotic
node0 3m 1.580s 2025-09-30 05:48:01.977 3964 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/30/2025-09-30T05+45+16.695999103Z_seq0_minr1_maxr501_orgn0.pces
node0 3m 1.580s 2025-09-30 05:48:01.977 3965 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 312 File: data/saved/preconsensus-events/0/2025/09/30/2025-09-30T05+45+16.695999103Z_seq0_minr1_maxr501_orgn0.pces
node0 3m 1.580s 2025-09-30 05:48:01.977 3966 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 3m 1.589s 2025-09-30 05:48:01.986 3967 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 3m 1.590s 2025-09-30 05:48:01.987 3968 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 340 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/340 {"round":340,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/340/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 3m 1.611s 2025-09-30 05:48:02.008 3934 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 340
node4 3m 1.614s 2025-09-30 05:48:02.011 3935 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 340
Timestamp: 2025-09-30T05:48:00.274742119Z
Next consensus number: 13065
Legacy running event hash: beeb878368767d76e5fb8a2aeca4e52fdfff423011fc5e114548541fc50c1469ff349a153c6305a3461e0a23ba681ac7
Legacy running event mnemonic: cargo-autumn-attract-put
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -799423150
Root hash: 5075ae11590773224e41d4f1ca87efdd8393327eacc12df4f15f2f96aab469e475d6793ee64bd11110515581a13f9495
(root) ConsistencyTestingToolState / attitude-digital-can-gold
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 reject-shiver-donkey-engage
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
  3 StringLeaf 2285048525102338985 /3 grass-flock-like-wage
  4 StringLeaf 339 /4 very-bulb-swallow-exotic
node4 3m 1.625s 2025-09-30 05:48:02.022 3936 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/09/30/2025-09-30T05+45+16.781749331Z_seq0_minr1_maxr501_orgn0.pces
node4 3m 1.625s 2025-09-30 05:48:02.022 3937 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 312 File: data/saved/preconsensus-events/4/2025/09/30/2025-09-30T05+45+16.781749331Z_seq0_minr1_maxr501_orgn0.pces
node4 3m 1.626s 2025-09-30 05:48:02.023 3938 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 3m 1.629s 2025-09-30 05:48:02.026 3908 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 340
node3 3m 1.631s 2025-09-30 05:48:02.028 3909 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 340
Timestamp: 2025-09-30T05:48:00.274742119Z
Next consensus number: 13065
Legacy running event hash: beeb878368767d76e5fb8a2aeca4e52fdfff423011fc5e114548541fc50c1469ff349a153c6305a3461e0a23ba681ac7
Legacy running event mnemonic: cargo-autumn-attract-put
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -799423150
Root hash: 5075ae11590773224e41d4f1ca87efdd8393327eacc12df4f15f2f96aab469e475d6793ee64bd11110515581a13f9495
(root) ConsistencyTestingToolState / attitude-digital-can-gold
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 reject-shiver-donkey-engage
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
  3 StringLeaf 2285048525102338985 /3 grass-flock-like-wage
  4 StringLeaf 339 /4 very-bulb-swallow-exotic
node4 3m 1.635s 2025-09-30 05:48:02.032 3939 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 3m 1.636s 2025-09-30 05:48:02.033 3940 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 340 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/340 {"round":340,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/340/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 3m 1.638s 2025-09-30 05:48:02.035 3910 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/30/2025-09-30T05+45+16.698864552Z_seq0_minr1_maxr501_orgn0.pces
node3 3m 1.639s 2025-09-30 05:48:02.036 3911 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 312 File: data/saved/preconsensus-events/3/2025/09/30/2025-09-30T05+45+16.698864552Z_seq0_minr1_maxr501_orgn0.pces
node3 3m 1.639s 2025-09-30 05:48:02.036 3912 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 3m 1.647s 2025-09-30 05:48:02.044 3913 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 3m 1.648s 2025-09-30 05:48:02.045 3914 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 340 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/340 {"round":340,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/340/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
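The write itself is quick: each node's "Started writing round 340 state to disk" and "Finished writing state for round 340 to disk" lines are roughly a tenth of a second apart. A sketch for measuring that per node and per round from the wall-clock column follows; the swirlds.log path and the exported column order (node id first, then elapsed time, then date and time of day) are assumptions about this particular view.

    from datetime import datetime

    def wall_clock(fields):
        # locate the "2025-09-30"-style date field, then join it with the time that follows
        i = next(k for k, f in enumerate(fields) if f.count("-") == 2 and f[0].isdigit())
        return datetime.fromisoformat(fields[i] + " " + fields[i + 1])

    def snapshot_write_latencies(path="swirlds.log"):
        started = {}
        latency = {}                          # (node id, round) -> timedelta
        with open(path, encoding="utf-8") as log:
            for line in log:
                fields = line.split()
                if "Started writing round" in line:
                    key = (fields[0], int(fields[fields.index("round") + 1]))
                    started[key] = wall_clock(fields)
                elif "Finished writing state for round" in line:
                    key = (fields[0], int(fields[fields.index("round") + 1]))
                    if key in started:
                        latency[key] = wall_clock(fields) - started[key]
        return latency

    # Example:
    # for (node, rnd), delta in snapshot_write_latencies().items():
    #     print(node, rnd, delta.total_seconds(), "s")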
node2 4.011m 2025-09-30 05:49:01.067 5476 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 477 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 4.012m 2025-09-30 05:49:01.101 5470 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 477 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 4.013m 2025-09-30 05:49:01.150 5436 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 477 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 4.015m 2025-09-30 05:49:01.272 5546 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 477 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 4.016m 2025-09-30 05:49:01.339 5439 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 477 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/477
node1 4.016m 2025-09-30 05:49:01.340 5440 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 477
node0 4.016m 2025-09-30 05:49:01.343 5549 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 477 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/477
node0 4.016m 2025-09-30 05:49:01.344 5550 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 477
node3 4m 1.012s 2025-09-30 05:49:01.409 5483 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 477 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/477
node3 4m 1.013s 2025-09-30 05:49:01.410 5484 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 477
node1 4m 1.028s 2025-09-30 05:49:01.425 5475 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 477
node1 4m 1.030s 2025-09-30 05:49:01.427 5476 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 477
Timestamp: 2025-09-30T05:49:00.194669Z
Next consensus number: 16695
Legacy running event hash: 02d0abfbce77280a9ae59df7e88fcbf912cbb565aaab7e8719e48a1a953206313564a7913b4fbaca6f00d72912bd3500
Legacy running event mnemonic: foam-level-couple-access
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -445852267
Root hash: ef2644ab64855bdc9da0feeb24c60536b7d2e54bfe0e21d4c922f794f57bfd2ef56d852a912ad2c1d87abbfc9f812ce1
(root) ConsistencyTestingToolState / brave-nice-mouse-knee
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 logic-bridge-sugar-fancy
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
  3 StringLeaf 5490647917212741423 /3 senior-sample-normal-arrange
  4 StringLeaf 476 /4 vapor-cart-pool-never
node1 4m 1.037s 2025-09-30 05:49:01.434 5477 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/30/2025-09-30T05+45+16.529211986Z_seq0_minr1_maxr501_orgn0.pces
node1 4m 1.038s 2025-09-30 05:49:01.435 5478 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 450 File: data/saved/preconsensus-events/1/2025/09/30/2025-09-30T05+45+16.529211986Z_seq0_minr1_maxr501_orgn0.pces
node1 4m 1.038s 2025-09-30 05:49:01.435 5479 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 4m 1.041s 2025-09-30 05:49:01.438 5581 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 477
node0 4m 1.043s 2025-09-30 05:49:01.440 5582 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 477
Timestamp: 2025-09-30T05:49:00.194669Z
Next consensus number: 16695
Legacy running event hash: 02d0abfbce77280a9ae59df7e88fcbf912cbb565aaab7e8719e48a1a953206313564a7913b4fbaca6f00d72912bd3500
Legacy running event mnemonic: foam-level-couple-access
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -445852267
Root hash: ef2644ab64855bdc9da0feeb24c60536b7d2e54bfe0e21d4c922f794f57bfd2ef56d852a912ad2c1d87abbfc9f812ce1
(root) ConsistencyTestingToolState / brave-nice-mouse-knee
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 logic-bridge-sugar-fancy
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
  3 StringLeaf 5490647917212741423 /3 senior-sample-normal-arrange
  4 StringLeaf 476 /4 vapor-cart-pool-never
node1 4m 1.049s 2025-09-30 05:49:01.446 5480 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 4m 1.050s 2025-09-30 05:49:01.447 5481 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 477 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/477 {"round":477,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/477/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 4m 1.051s 2025-09-30 05:49:01.448 5583 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/30/2025-09-30T05+45+16.695999103Z_seq0_minr1_maxr501_orgn0.pces
node0 4m 1.051s 2025-09-30 05:49:01.448 5584 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 450 File: data/saved/preconsensus-events/0/2025/09/30/2025-09-30T05+45+16.695999103Z_seq0_minr1_maxr501_orgn0.pces
node0 4m 1.051s 2025-09-30 05:49:01.448 5585 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 4m 1.063s 2025-09-30 05:49:01.460 5586 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 4m 1.063s 2025-09-30 05:49:01.460 5587 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 477 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/477 {"round":477,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/477/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 4m 1.093s 2025-09-30 05:49:01.490 5489 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 477 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/477
node2 4m 1.093s 2025-09-30 05:49:01.490 5490 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 477
node3 4m 1.100s 2025-09-30 05:49:01.497 5515 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 477
node3 4m 1.102s 2025-09-30 05:49:01.499 5516 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 477
Timestamp: 2025-09-30T05:49:00.194669Z
Next consensus number: 16695
Legacy running event hash: 02d0abfbce77280a9ae59df7e88fcbf912cbb565aaab7e8719e48a1a953206313564a7913b4fbaca6f00d72912bd3500
Legacy running event mnemonic: foam-level-couple-access
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -445852267
Root hash: ef2644ab64855bdc9da0feeb24c60536b7d2e54bfe0e21d4c922f794f57bfd2ef56d852a912ad2c1d87abbfc9f812ce1
(root) ConsistencyTestingToolState / brave-nice-mouse-knee
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 logic-bridge-sugar-fancy
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
  3 StringLeaf 5490647917212741423 /3 senior-sample-normal-arrange
  4 StringLeaf 476 /4 vapor-cart-pool-never
node3 4m 1.110s 2025-09-30 05:49:01.507 5517 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/30/2025-09-30T05+45+16.698864552Z_seq0_minr1_maxr501_orgn0.pces
node3 4m 1.110s 2025-09-30 05:49:01.507 5518 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 450 File: data/saved/preconsensus-events/3/2025/09/30/2025-09-30T05+45+16.698864552Z_seq0_minr1_maxr501_orgn0.pces
node3 4m 1.110s 2025-09-30 05:49:01.507 5519 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 4m 1.121s 2025-09-30 05:49:01.518 5520 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 4m 1.122s 2025-09-30 05:49:01.519 5521 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 477 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/477 {"round":477,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/477/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 4m 1.181s 2025-09-30 05:49:01.578 5525 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 477
node2 4m 1.183s 2025-09-30 05:49:01.580 5526 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 477
Timestamp: 2025-09-30T05:49:00.194669Z
Next consensus number: 16695
Legacy running event hash: 02d0abfbce77280a9ae59df7e88fcbf912cbb565aaab7e8719e48a1a953206313564a7913b4fbaca6f00d72912bd3500
Legacy running event mnemonic: foam-level-couple-access
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -445852267
Root hash: ef2644ab64855bdc9da0feeb24c60536b7d2e54bfe0e21d4c922f794f57bfd2ef56d852a912ad2c1d87abbfc9f812ce1
(root) ConsistencyTestingToolState / brave-nice-mouse-knee
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 logic-bridge-sugar-fancy
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
  3 StringLeaf 5490647917212741423 /3 senior-sample-normal-arrange
  4 StringLeaf 476 /4 vapor-cart-pool-never
node2 4m 1.193s 2025-09-30 05:49:01.590 5527 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/30/2025-09-30T05+45+16.662580277Z_seq0_minr1_maxr501_orgn0.pces
node2 4m 1.193s 2025-09-30 05:49:01.590 5528 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 450 File: data/saved/preconsensus-events/2/2025/09/30/2025-09-30T05+45+16.662580277Z_seq0_minr1_maxr501_orgn0.pces
node2 4m 1.193s 2025-09-30 05:49:01.590 5529 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 4m 1.204s 2025-09-30 05:49:01.601 5530 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 4m 1.205s 2025-09-30 05:49:01.602 5531 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 477 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/477 {"round":477,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/477/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
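The preconsensus event (PCES) file names referenced by these copy steps pack several fields into the name, e.g. 2025-09-30T05+45+16.529211986Z_seq0_minr1_maxr501_orgn0.pces. Reading them as <timestamp with ':' escaped as '+'>_seq<sequence>_minr<lower round bound>_maxr<upper round bound>_orgn<origin>.pces is an interpretation of the pattern visible in these lines, not a statement of the platform's file format; a small parser under that assumption:

    import re
    from pathlib import Path

    PCES_NAME = re.compile(
        r"(?P<ts>.+Z)_seq(?P<seq>\d+)_minr(?P<minr>\d+)_maxr(?P<maxr>\d+)_orgn(?P<orgn>\d+)\.pces$"
    )

    def parse_pces_name(path):
        m = PCES_NAME.match(Path(path).name)
        if m is None:
            raise ValueError(f"not a PCES file name: {path}")
        return {
            "timestamp": m.group("ts").replace("+", ":"),  # undo the ':' -> '+' escaping
            "sequence": int(m.group("seq")),
            "min_round": int(m.group("minr")),
            "max_round": int(m.group("maxr")),
            "origin": int(m.group("orgn")),
        }

    # parse_pces_name("2025-09-30T05+49+11.525370091Z_seq1_minr474_maxr5474_orgn0.pces")
    # -> {'timestamp': '2025-09-30T05:49:11.525370091Z', 'sequence': 1,
    #     'min_round': 474, 'max_round': 5474, 'origin': 0}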
node1 5.010m 2025-09-30 05:50:01.000 7086 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 615 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 5.012m 2025-09-30 05:50:01.088 7124 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 615 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 5.012m 2025-09-30 05:50:01.090 7082 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 615 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 5.012m 2025-09-30 05:50:01.141 7134 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 615 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 5.015m 2025-09-30 05:50:01.304 7137 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 615 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/615
node2 5.015m 2025-09-30 05:50:01.304 7138 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 615
node3 5.015m 2025-09-30 05:50:01.304 7085 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 615 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/615
node3 5.015m 2025-09-30 05:50:01.305 7086 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 615
node1 5.016m 2025-09-30 05:50:01.357 7089 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 615 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/615
node1 5.016m 2025-09-30 05:50:01.358 7090 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 615
node2 5.017m 2025-09-30 05:50:01.391 7173 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 615
node2 5.017m 2025-09-30 05:50:01.393 7174 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 615
Timestamp: 2025-09-30T05:50:00.282013617Z
Next consensus number: 20018
Legacy running event hash: 5b170f2f0867bda52bd53d8e3d7472b8f9bb63b5635ada98752381e50c3e7459b332c6604606ef32518e62e97fddbd57
Legacy running event mnemonic: glass-tourist-artefact-enter
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -675868155
Root hash: 57496c90ca16eabcf0c4ffec9f33494638109627d42e8b29a79927acf73a22987330453c5094fd5b48830c268308587e
(root) ConsistencyTestingToolState / truck-must-run-possible
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 fragile-direct-reveal-hedgehog
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
  3 StringLeaf 1104848267977956130 /3 item-surge-enrich-dose
  4 StringLeaf 614 /4 universe-monster-error-exclude
node3 5.017m 2025-09-30 05:50:01.394 7121 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 615
node3 5.017m 2025-09-30 05:50:01.396 7122 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 615
Timestamp: 2025-09-30T05:50:00.282013617Z
Next consensus number: 20018
Legacy running event hash: 5b170f2f0867bda52bd53d8e3d7472b8f9bb63b5635ada98752381e50c3e7459b332c6604606ef32518e62e97fddbd57
Legacy running event mnemonic: glass-tourist-artefact-enter
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -675868155
Root hash: 57496c90ca16eabcf0c4ffec9f33494638109627d42e8b29a79927acf73a22987330453c5094fd5b48830c268308587e
(root) ConsistencyTestingToolState / truck-must-run-possible
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 fragile-direct-reveal-hedgehog
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
  3 StringLeaf 1104848267977956130 /3 item-surge-enrich-dose
  4 StringLeaf 614 /4 universe-monster-error-exclude
node2 5m 1.003s 2025-09-30 05:50:01.400 7175 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/09/30/2025-09-30T05+45+16.662580277Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/2/2025/09/30/2025-09-30T05+49+11.525370091Z_seq1_minr474_maxr5474_orgn0.pces
node2 5m 1.003s 2025-09-30 05:50:01.400 7176 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 588 File: data/saved/preconsensus-events/2/2025/09/30/2025-09-30T05+49+11.525370091Z_seq1_minr474_maxr5474_orgn0.pces
node2 5m 1.003s 2025-09-30 05:50:01.400 7177 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 5m 1.005s 2025-09-30 05:50:01.402 7178 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 5m 1.006s 2025-09-30 05:50:01.403 7179 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 615 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/615 {"round":615,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/615/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 5m 1.006s 2025-09-30 05:50:01.403 7123 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/09/30/2025-09-30T05+49+11.568413001Z_seq1_minr474_maxr5474_orgn0.pces Last file: data/saved/preconsensus-events/3/2025/09/30/2025-09-30T05+45+16.698864552Z_seq0_minr1_maxr501_orgn0.pces
node3 5m 1.006s 2025-09-30 05:50:01.403 7124 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 588 File: data/saved/preconsensus-events/3/2025/09/30/2025-09-30T05+49+11.568413001Z_seq1_minr474_maxr5474_orgn0.pces
node3 5m 1.006s 2025-09-30 05:50:01.403 7125 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 5m 1.008s 2025-09-30 05:50:01.405 7180 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/2
node3 5m 1.008s 2025-09-30 05:50:01.405 7126 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 5m 1.009s 2025-09-30 05:50:01.406 7127 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 615 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/615 {"round":615,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/615/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 5m 1.011s 2025-09-30 05:50:01.408 7128 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/2
node1 5m 1.044s 2025-09-30 05:50:01.441 7128 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 615
node1 5m 1.046s 2025-09-30 05:50:01.443 7129 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 615
Timestamp: 2025-09-30T05:50:00.282013617Z
Next consensus number: 20018
Legacy running event hash: 5b170f2f0867bda52bd53d8e3d7472b8f9bb63b5635ada98752381e50c3e7459b332c6604606ef32518e62e97fddbd57
Legacy running event mnemonic: glass-tourist-artefact-enter
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -675868155
Root hash: 57496c90ca16eabcf0c4ffec9f33494638109627d42e8b29a79927acf73a22987330453c5094fd5b48830c268308587e
(root) ConsistencyTestingToolState / truck-must-run-possible
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 fragile-direct-reveal-hedgehog
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
  3 StringLeaf 1104848267977956130 /3 item-surge-enrich-dose
  4 StringLeaf 614 /4 universe-monster-error-exclude
node1 5m 1.053s 2025-09-30 05:50:01.450 7130 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/09/30/2025-09-30T05+45+16.529211986Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/1/2025/09/30/2025-09-30T05+49+11.558579248Z_seq1_minr474_maxr5474_orgn0.pces
node1 5m 1.053s 2025-09-30 05:50:01.450 7131 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 588 File: data/saved/preconsensus-events/1/2025/09/30/2025-09-30T05+49+11.558579248Z_seq1_minr474_maxr5474_orgn0.pces
node1 5m 1.053s 2025-09-30 05:50:01.450 7132 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 5m 1.055s 2025-09-30 05:50:01.452 7133 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 5m 1.056s 2025-09-30 05:50:01.453 7134 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 615 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/615 {"round":615,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/615/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 5m 1.057s 2025-09-30 05:50:01.454 7135 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/2
node0 5m 1.123s 2025-09-30 05:50:01.520 7137 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 615 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/615
node0 5m 1.124s 2025-09-30 05:50:01.521 7139 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 615
node0 5m 1.216s 2025-09-30 05:50:01.613 7181 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 615
node0 5m 1.218s 2025-09-30 05:50:01.615 7182 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 615
Timestamp: 2025-09-30T05:50:00.282013617Z
Next consensus number: 20018
Legacy running event hash: 5b170f2f0867bda52bd53d8e3d7472b8f9bb63b5635ada98752381e50c3e7459b332c6604606ef32518e62e97fddbd57
Legacy running event mnemonic: glass-tourist-artefact-enter
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -675868155
Root hash: 57496c90ca16eabcf0c4ffec9f33494638109627d42e8b29a79927acf73a22987330453c5094fd5b48830c268308587e
(root) ConsistencyTestingToolState / truck-must-run-possible
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 fragile-direct-reveal-hedgehog
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
  3 StringLeaf 1104848267977956130 /3 item-surge-enrich-dose
  4 StringLeaf 614 /4 universe-monster-error-exclude
node0 5m 1.226s 2025-09-30 05:50:01.623 7183 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/09/30/2025-09-30T05+45+16.695999103Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/0/2025/09/30/2025-09-30T05+49+11.675097831Z_seq1_minr473_maxr5473_orgn0.pces
node0 5m 1.226s 2025-09-30 05:50:01.623 7184 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 588 File: data/saved/preconsensus-events/0/2025/09/30/2025-09-30T05+49+11.675097831Z_seq1_minr473_maxr5473_orgn0.pces
node0 5m 1.226s 2025-09-30 05:50:01.623 7185 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 5m 1.228s 2025-09-30 05:50:01.625 7186 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 5m 1.229s 2025-09-30 05:50:01.626 7187 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 615 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/615 {"round":615,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/615/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 5m 1.231s 2025-09-30 05:50:01.628 7188 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/2
node4 5m 51.818s 2025-09-30 05:50:52.215 1 INFO STARTUP <main> StaticPlatformBuilder:
////////////////////// // Node is Starting // //////////////////////
node4 5m 51.924s 2025-09-30 05:50:52.321 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node4 5m 51.941s 2025-09-30 05:50:52.338 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 5m 52.075s 2025-09-30 05:50:52.472 4 INFO STARTUP <main> Browser: The following nodes [4] are set to run locally
node4 5m 52.083s 2025-09-30 05:50:52.480 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node4 5m 52.097s 2025-09-30 05:50:52.494 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 5m 52.568s 2025-09-30 05:50:52.965 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node4 5m 52.569s 2025-09-30 05:50:52.966 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node4 5m 53.540s 2025-09-30 05:50:53.937 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 971ms
node4 5m 53.551s 2025-09-30 05:50:53.948 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node4 5m 53.554s 2025-09-30 05:50:53.951 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 5m 53.601s 2025-09-30 05:50:53.998 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listening on port: 9999
node4 5m 53.686s 2025-09-30 05:50:54.083 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node4 5m 53.689s 2025-09-30 05:50:54.086 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node4 5m 55.808s 2025-09-30 05:50:56.205 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node4 5m 55.898s 2025-09-30 05:50:56.295 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 55.909s 2025-09-30 05:50:56.306 21 INFO STARTUP <main> StartupStateUtils: The following saved states were found on disk:
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/340/SignedState.swh
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/213/SignedState.swh
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/92/SignedState.swh
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/2/SignedState.swh
node4 5m 55.910s 2025-09-30 05:50:56.307 22 INFO STARTUP <main> StartupStateUtils: Loading latest state from disk.
node4 5m 55.910s 2025-09-30 05:50:56.307 23 INFO STARTUP <main> StartupStateUtils: Loading signed state from disk: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/340/SignedState.swh
node4 5m 55.915s 2025-09-30 05:50:56.312 24 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 5m 55.921s 2025-09-30 05:50:56.318 25 INFO STATE_TO_DISK <main> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp
node4 5m 56.059s 2025-09-30 05:50:56.456 36 INFO STARTUP <main> StartupStateUtils: Loaded state's hash is the same as when it was saved.
node4 5m 56.062s 2025-09-30 05:50:56.459 37 INFO STARTUP <main> StartupStateUtils: Platform has loaded a saved state {"round":340,"consensusTimestamp":"2025-09-30T05:48:00.274742119Z"} [com.swirlds.logging.legacy.payload.SavedStateLoadedPayload]
node4 5m 56.065s 2025-09-30 05:50:56.462 40 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 56.072s 2025-09-30 05:50:56.469 43 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node4 5m 56.075s 2025-09-30 05:50:56.472 44 INFO STARTUP <main> AddressBookInitializer: Using the loaded state's address book and weight values.
node4 5m 56.081s 2025-09-30 05:50:56.478 45 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 56.083s 2025-09-30 05:50:56.480 46 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 57.134s 2025-09-30 05:50:57.531 47 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26130740]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=421460, randomLong=8711652395786763606, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=8290, randomLong=305646656407578827, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1632570, data=35, exception=null]
OS Health Check Report - Complete (took 1038 ms)
node4 5m 57.168s 2025-09-30 05:50:57.565 48 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node4 5m 57.300s 2025-09-30 05:50:57.697 49 INFO STARTUP <main> PcesUtilities: Span compaction completed for data/saved/preconsensus-events/4/2025/09/30/2025-09-30T05+45+16.781749331Z_seq0_minr1_maxr501_orgn0.pces, new upper bound is 364
node4 5m 57.302s 2025-09-30 05:50:57.699 50 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node4 5m 57.306s 2025-09-30 05:50:57.703 51 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node4 5m 57.380s 2025-09-30 05:50:57.777 52 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "iHDorA==", "port": 30124 }, { "ipAddressV4": "CoAAeQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "IjghRQ==", "port": 30125 }, { "ipAddressV4": "CoAAew==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "IqzvOQ==", "port": 30126 }, { "ipAddressV4": "CoAAeA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "iHJx3w==", "port": 30127 }, { "ipAddressV4": "CoAAeg==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "I7wqdA==", "port": 30128 }, { "ipAddressV4": "CoAAfA==", "port": 30128 }] }] }
node4 5m 57.400s 2025-09-30 05:50:57.797 53 INFO STARTUP <main> ConsistencyTestingToolState: State initialized with state long 2285048525102338985.
node4 5m 57.400s 2025-09-30 05:50:57.797 54 INFO STARTUP <main> ConsistencyTestingToolState: State initialized with 339 rounds handled.
node4 5m 57.401s 2025-09-30 05:50:57.798 55 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 5m 57.401s 2025-09-30 05:50:57.798 56 INFO STARTUP <main> TransactionHandlingHistory: Log file found. Parsing previous history
node4 5m 58.216s 2025-09-30 05:50:58.613 57 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 340
Timestamp: 2025-09-30T05:48:00.274742119Z
Next consensus number: 13065
Legacy running event hash: beeb878368767d76e5fb8a2aeca4e52fdfff423011fc5e114548541fc50c1469ff349a153c6305a3461e0a23ba681ac7
Legacy running event mnemonic: cargo-autumn-attract-put
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -799423150
Root hash: 5075ae11590773224e41d4f1ca87efdd8393327eacc12df4f15f2f96aab469e475d6793ee64bd11110515581a13f9495
(root) ConsistencyTestingToolState / attitude-digital-can-gold
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 reject-shiver-donkey-engage
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
  3 StringLeaf 2285048525102338985 /3 grass-flock-like-wage
  4 StringLeaf 339 /4 very-bulb-swallow-exotic
node4 5m 58.479s 2025-09-30 05:50:58.876 59 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: beeb878368767d76e5fb8a2aeca4e52fdfff423011fc5e114548541fc50c1469ff349a153c6305a3461e0a23ba681ac7
node4 5m 58.494s 2025-09-30 05:50:58.891 60 INFO STARTUP <platformForkJoinThread-4> Shadowgraph: Shadowgraph starting from expiration threshold 312
node4 5m 58.502s 2025-09-30 05:50:58.899 62 INFO STARTUP <<start-node-4>> ConsistencyTestingToolMain: init called in Main for node 4.
node4 5m 58.503s 2025-09-30 05:50:58.900 63 INFO STARTUP <<start-node-4>> SwirldsPlatform: Starting platform 4
node4 5m 58.505s 2025-09-30 05:50:58.902 64 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node4 5m 58.508s 2025-09-30 05:50:58.905 65 INFO STARTUP <<start-node-4>> CycleFinder: No cyclical back pressure detected in wiring model.
node4 5m 58.510s 2025-09-30 05:50:58.907 66 INFO STARTUP <<start-node-4>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node4 5m 58.510s 2025-09-30 05:50:58.907 67 INFO STARTUP <<start-node-4>> InputWireChecks: All input wires have been bound.
node4 5m 58.513s 2025-09-30 05:50:58.910 68 INFO STARTUP <<start-node-4>> SwirldsPlatform: replaying preconsensus event stream starting at 312
node4 5m 58.520s 2025-09-30 05:50:58.917 69 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 208.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node4 5m 58.833s 2025-09-30 05:50:59.230 70 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:0 H:b28093f01357 BR:338), num remaining: 4
node4 5m 58.834s 2025-09-30 05:50:59.231 71 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:1 H:b4c6d8ee8861 BR:338), num remaining: 3
node4 5m 58.835s 2025-09-30 05:50:59.232 72 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:3 H:d05fb05e0c40 BR:338), num remaining: 2
node4 5m 58.836s 2025-09-30 05:50:59.233 73 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:4 H:e8ca6d9dd757 BR:339), num remaining: 1
node4 5m 58.836s 2025-09-30 05:50:59.233 74 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:2 H:66d8ae9f22ef BR:338), num remaining: 0
node4 5m 59.017s 2025-09-30 05:50:59.414 153 INFO STARTUP <<start-node-4>> PcesReplayer: Replayed 1,908 preconsensus events with max birth round 364. These events contained 2,617 transactions. 23 rounds reached consensus spanning 10.3 seconds of consensus time. The latest round to reach consensus is round 363. Replay took 503.0 milliseconds.
node4 5m 59.021s 2025-09-30 05:50:59.418 155 INFO STARTUP <<app: appMain 4>> ConsistencyTestingToolMain: run called in Main.
node4 5m 59.021s 2025-09-30 05:50:59.418 156 INFO PLATFORM_STATUS <platformForkJoinThread-2> StatusStateMachine: Platform spent 498.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node4 5m 59.948s 2025-09-30 05:51:00.345 279 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> RpcPeerHandler: SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=363,ancientThreshold=336,expiredThreshold=312] remote ev=EventWindow[latestConsensusRound=751,ancientThreshold=724,expiredThreshold=650]
node0 6.000m 2025-09-30 05:51:00.416 8757 INFO RECONNECT <<platform-core: SyncProtocolWith4 0 to 4>> RpcPeerHandler: OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=751,ancientThreshold=724,expiredThreshold=650] remote ev=EventWindow[latestConsensusRound=363,ancientThreshold=336,expiredThreshold=312]
node1 6.000m 2025-09-30 05:51:00.416 8772 INFO RECONNECT <<platform-core: SyncProtocolWith4 1 to 4>> RpcPeerHandler: OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=752,ancientThreshold=725,expiredThreshold=651] remote ev=EventWindow[latestConsensusRound=363,ancientThreshold=336,expiredThreshold=312]
node2 6.000m 2025-09-30 05:51:00.416 8677 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> RpcPeerHandler: OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=751,ancientThreshold=724,expiredThreshold=650] remote ev=EventWindow[latestConsensusRound=363,ancientThreshold=336,expiredThreshold=312]
node3 6.000m 2025-09-30 05:51:00.416 8625 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> RpcPeerHandler: OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=751,ancientThreshold=724,expiredThreshold=650] remote ev=EventWindow[latestConsensusRound=363,ancientThreshold=336,expiredThreshold=312]
node4 6.001m 2025-09-30 05:51:00.485 280 INFO RECONNECT <<platform-core: SyncProtocolWith3 4 to 3>> RpcPeerHandler: SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=363,ancientThreshold=336,expiredThreshold=312] remote ev=EventWindow[latestConsensusRound=751,ancientThreshold=724,expiredThreshold=650]
node4 6.001m 2025-09-30 05:51:00.485 281 INFO RECONNECT <<platform-core: SyncProtocolWith0 4 to 0>> RpcPeerHandler: SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=363,ancientThreshold=336,expiredThreshold=312] remote ev=EventWindow[latestConsensusRound=751,ancientThreshold=724,expiredThreshold=650]
node4 6.001m 2025-09-30 05:51:00.486 282 INFO PLATFORM_STATUS <platformForkJoinThread-7> StatusStateMachine: Platform spent 1.1 s in OBSERVING. Now in BEHIND
node4 6.001m 2025-09-30 05:51:00.486 283 INFO RECONNECT <<platform-core: SyncProtocolWith1 4 to 1>> RpcPeerHandler: SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=363,ancientThreshold=336,expiredThreshold=312] remote ev=EventWindow[latestConsensusRound=752,ancientThreshold=725,expiredThreshold=651]
node4 6.002m 2025-09-30 05:51:00.487 284 INFO RECONNECT <platformForkJoinThread-6> ReconnectController: Starting ReconnectController
node4 6.002m 2025-09-30 05:51:00.487 285 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectPlatformHelperImpl: Preparing for reconnect, stopping gossip
node4 6.004m 2025-09-30 05:51:00.639 286 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectPlatformHelperImpl: Preparing for reconnect, start clearing queues
node4 6.004m 2025-09-30 05:51:00.641 287 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectPlatformHelperImpl: Queues have been cleared
node4 6.004m 2025-09-30 05:51:00.643 288 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectController: waiting for reconnect connection
node4 6.004m 2025-09-30 05:51:00.643 289 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectController: acquired reconnect connection
node3 6.006m 2025-09-30 05:51:00.734 8629 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> ReconnectTeacher: Starting reconnect in the role of the sender {"receiving":false,"nodeId":3,"otherNodeId":4,"round":752} [com.swirlds.logging.legacy.payload.ReconnectStartPayload]
node3 6.006m 2025-09-30 05:51:00.735 8630 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> ReconnectTeacher: The following state will be sent to the learner:
Round: 752
Timestamp: 2025-09-30T05:50:59.608106945Z
Next consensus number: 23316
Legacy running event hash: 702c20e90dbbb87283cde297c2c2a4c9202d5120ae3362e93709a570c1a5ea11b29103f957b4306b8d3a68ab21a4a65b
Legacy running event mnemonic: airport-brush-resist-vault
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1802824946
Root hash: 560221273bc3995112212f26797f2d8bdc41a7ecd699154d06859c3142b5641efb3e908d8702720a7885e0fb80437706
(root) ConsistencyTestingToolState / scare-ceiling-journey-sting
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 sting-envelope-sibling-purchase
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
  3 StringLeaf 948623268828485337 /3 butter-bridge-need-oxygen
  4 StringLeaf 751 /4 harbor-liar-grape-crouch
node3 6.006m 2025-09-30 05:51:00.736 8631 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> ReconnectTeacher: Sending signatures from nodes 0, 1, 3 (signing weight = 37500000000/50000000000) for state hash 560221273bc3995112212f26797f2d8bdc41a7ecd699154d06859c3142b5641efb3e908d8702720a7885e0fb80437706
node3 6.006m 2025-09-30 05:51:00.736 8632 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> ReconnectTeacher: Starting synchronization in the role of the sender.
node3 6.006m 2025-09-30 05:51:00.741 8633 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> TeachingSynchronizer: sending tree rooted at com.swirlds.demo.consistency.ConsistencyTestingToolState with route []
node3 6.006m 2025-09-30 05:51:00.750 8634 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@2da25975 start run()
node4 6.007m 2025-09-30 05:51:00.803 290 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectSyncHelper: Starting reconnect in role of the receiver. {"receiving":true,"nodeId":4,"otherNodeId":3,"round":362} [com.swirlds.logging.legacy.payload.ReconnectStartPayload]
node4 6.007m 2025-09-30 05:51:00.805 291 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectLearner: Receiving signed state signatures
node4 6.007m 2025-09-30 05:51:00.806 292 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectLearner: Received signatures from nodes 0, 1, 3
node4 6.007m 2025-09-30 05:51:00.809 293 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls receiveTree()
node4 6.007m 2025-09-30 05:51:00.810 294 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: synchronizing tree
node4 6.007m 2025-09-30 05:51:00.810 295 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: receiving tree rooted at com.swirlds.demo.consistency.ConsistencyTestingToolState with route []
node1 6.007m 2025-09-30 05:51:00.811 8782 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 753 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 6.007m 2025-09-30 05:51:00.816 296 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@6f81f210 start run()
node4 6.007m 2025-09-30 05:51:00.822 297 INFO STARTUP <<work group learning-synchronizer: async-input-stream #0>> ConsistencyTestingToolState: New State Constructed.
node3 6.008m 2025-09-30 05:51:00.890 8657 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 753 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 6.008m 2025-09-30 05:51:00.902 8780 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 753 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 6.008m 2025-09-30 05:51:00.903 8660 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@2da25975 finish run()
node3 6.008m 2025-09-30 05:51:00.904 8661 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> TeachingSynchronizer: finished sending tree
node3 6.008m 2025-09-30 05:51:00.905 8662 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> TeachingSynchronizer: sending tree rooted at com.swirlds.virtualmap.VirtualMap with route [2]
node3 6.008m 2025-09-30 05:51:00.906 8663 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@10ef3dab start run()
node2 6.009m 2025-09-30 05:51:00.937 8690 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 753 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 6.011m 2025-09-30 05:51:01.034 321 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread finished the learning loop for the current subtree
node4 6.011m 2025-09-30 05:51:01.035 322 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread closed input, output, and view for the current subtree
node4 6.011m 2025-09-30 05:51:01.035 323 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@6f81f210 finish run()
node4 6.011m 2025-09-30 05:51:01.036 324 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: received tree rooted at com.swirlds.demo.consistency.ConsistencyTestingToolState with route []
node4 6.011m 2025-09-30 05:51:01.037 325 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: receiving tree rooted at com.swirlds.virtualmap.VirtualMap with route [2]
node4 6.011m 2025-09-30 05:51:01.040 326 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@57387302 start run()
node4 6.012m 2025-09-30 05:51:01.098 327 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> ReconnectNodeRemover: setPathInformation(): firstLeafPath: 1 -> 1, lastLeafPath: 1 -> 1
node4 6.012m 2025-09-30 05:51:01.099 328 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> ReconnectNodeRemover: setPathInformation(): done
node4 6.012m 2025-09-30 05:51:01.101 329 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread finished the learning loop for the current subtree
node4 6.012m 2025-09-30 05:51:01.102 330 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: call nodeRemover.allNodesReceived()
node4 6.012m 2025-09-30 05:51:01.103 331 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> ReconnectNodeRemover: allNodesReceived()
node4 6.012m 2025-09-30 05:51:01.103 332 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> ReconnectNodeRemover: allNodesReceived(): done
node4 6.012m 2025-09-30 05:51:01.103 333 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: call root.endLearnerReconnect()
node4 6.012m 2025-09-30 05:51:01.103 334 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call reconnectIterator.close()
node4 6.012m 2025-09-30 05:51:01.103 335 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call setHashPrivate()
node2 6.012m 2025-09-30 05:51:01.118 8693 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 753 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/753
node2 6.012m 2025-09-30 05:51:01.118 8694 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 753
node3 6.012m 2025-09-30 05:51:01.118 8664 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 753 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/753
node3 6.012m 2025-09-30 05:51:01.119 8665 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/38 for round 753
node1 6.013m 2025-09-30 05:51:01.153 8785 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 753 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/753
node1 6.013m 2025-09-30 05:51:01.154 8786 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 753
node3 6.013m 2025-09-30 05:51:01.171 8688 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@10ef3dab finish run()
node3 6.013m 2025-09-30 05:51:01.172 8691 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> TeachingSynchronizer: finished sending tree
node3 6.013m 2025-09-30 05:51:01.184 8695 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> ReconnectTeacher: Finished synchronization in the role of the sender.
node2 6.013m 2025-09-30 05:51:01.203 8729 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 753
node2 6.013m 2025-09-30 05:51:01.204 8730 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 753
Timestamp: 2025-09-30T05:51:00.071071673Z
Next consensus number: 23340
Legacy running event hash: 1d21a97f7843190838d09ae0e522be10136d455cdddca32b66f2580f0f59810322a90531948b3dbc41aaddf6c812b0a8
Legacy running event mnemonic: picnic-south-anchor-canal
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1946087348
Root hash: 39e588b91bcaa65a268f31559b33a91c492eed05072fad387cf6fcd259ef09d55834465d241bfa8f2169ca9e1dcc5724
(root) ConsistencyTestingToolState / device-trophy-junk-crumble
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 keep-receive-service-adapt
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
  3 StringLeaf -2925130888924614976 /3 region-beef-chair-blanket
  4 StringLeaf 752 /4 depart-grocery-kitten-erupt
node2 6.014m 2025-09-30 05:51:01.213 8731 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/09/30/2025-09-30T05+45+16.662580277Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/2/2025/09/30/2025-09-30T05+49+11.525370091Z_seq1_minr474_maxr5474_orgn0.pces
node2 6.014m 2025-09-30 05:51:01.214 8732 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 726 File: data/saved/preconsensus-events/2/2025/09/30/2025-09-30T05+49+11.525370091Z_seq1_minr474_maxr5474_orgn0.pces
node2 6.014m 2025-09-30 05:51:01.214 8733 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 6.014m 2025-09-30 05:51:01.217 8704 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/38 for round 753
node2 6.014m 2025-09-30 05:51:01.218 8734 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 6.014m 2025-09-30 05:51:01.219 8735 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 753 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/753 {"round":753,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/753/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 6.014m 2025-09-30 05:51:01.219 8705 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 753
Timestamp: 2025-09-30T05:51:00.071071673Z
Next consensus number: 23340
Legacy running event hash: 1d21a97f7843190838d09ae0e522be10136d455cdddca32b66f2580f0f59810322a90531948b3dbc41aaddf6c812b0a8
Legacy running event mnemonic: picnic-south-anchor-canal
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1946087348
Root hash: 39e588b91bcaa65a268f31559b33a91c492eed05072fad387cf6fcd259ef09d55834465d241bfa8f2169ca9e1dcc5724
(root) ConsistencyTestingToolState / device-trophy-junk-crumble
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 keep-receive-service-adapt
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
  3 StringLeaf -2925130888924614976 /3 region-beef-chair-blanket
  4 StringLeaf 752 /4 depart-grocery-kitten-erupt
node2 6.014m 2025-09-30 05:51:01.220 8736 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/92
node3 6.014m 2025-09-30 05:51:01.225 8706 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/09/30/2025-09-30T05+49+11.568413001Z_seq1_minr474_maxr5474_orgn0.pces Last file: data/saved/preconsensus-events/3/2025/09/30/2025-09-30T05+45+16.698864552Z_seq0_minr1_maxr501_orgn0.pces
node3 6.014m 2025-09-30 05:51:01.226 8707 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 726 File: data/saved/preconsensus-events/3/2025/09/30/2025-09-30T05+49+11.568413001Z_seq1_minr474_maxr5474_orgn0.pces
node3 6.014m 2025-09-30 05:51:01.226 8708 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 6.014m 2025-09-30 05:51:01.230 8709 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 6.014m 2025-09-30 05:51:01.231 8710 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 753 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/753 {"round":753,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/753/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 6.014m 2025-09-30 05:51:01.232 8711 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/92
node1 6.014m 2025-09-30 05:51:01.239 8821 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 753
node1 6.014m 2025-09-30 05:51:01.241 8822 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 753
Timestamp: 2025-09-30T05:51:00.071071673Z
Next consensus number: 23340
Legacy running event hash: 1d21a97f7843190838d09ae0e522be10136d455cdddca32b66f2580f0f59810322a90531948b3dbc41aaddf6c812b0a8
Legacy running event mnemonic: picnic-south-anchor-canal
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1946087348
Root hash: 39e588b91bcaa65a268f31559b33a91c492eed05072fad387cf6fcd259ef09d55834465d241bfa8f2169ca9e1dcc5724
(root) ConsistencyTestingToolState / device-trophy-junk-crumble
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 keep-receive-service-adapt
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
  3 StringLeaf -2925130888924614976 /3 region-beef-chair-blanket
  4 StringLeaf 752 /4 depart-grocery-kitten-erupt
node1 6.014m 2025-09-30 05:51:01.248 8823 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/09/30/2025-09-30T05+45+16.529211986Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/1/2025/09/30/2025-09-30T05+49+11.558579248Z_seq1_minr474_maxr5474_orgn0.pces
node1 6.014m 2025-09-30 05:51:01.248 8824 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 726 File: data/saved/preconsensus-events/1/2025/09/30/2025-09-30T05+49+11.558579248Z_seq1_minr474_maxr5474_orgn0.pces
node1 6.014m 2025-09-30 05:51:01.249 8825 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 6.014m 2025-09-30 05:51:01.254 8829 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 6.014m 2025-09-30 05:51:01.254 8830 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 753 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/753 {"round":753,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/753/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 6.014m 2025-09-30 05:51:01.256 8831 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/92
node4 6.015m 2025-09-30 05:51:01.276 345 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call postInit()
node4 6.015m 2025-09-30 05:51:01.276 347 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: endLearnerReconnect() complete
node4 6.015m 2025-09-30 05:51:01.277 348 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: close() complete
node4 6.015m 2025-09-30 05:51:01.277 349 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread closed input, output, and view for the current subtree
node4 6.015m 2025-09-30 05:51:01.278 350 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@57387302 finish run()
node4 6.015m 2025-09-30 05:51:01.278 351 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: received tree rooted at com.swirlds.virtualmap.VirtualMap with route [2]
node4 6.015m 2025-09-30 05:51:01.278 352 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: synchronization complete
node4 6.015m 2025-09-30 05:51:01.279 353 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls initialize()
node4 6.015m 2025-09-30 05:51:01.279 354 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: initializing tree
node4 6.015m 2025-09-30 05:51:01.279 355 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: initialization complete
node4 6.015m 2025-09-30 05:51:01.280 356 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls hash()
node4 6.015m 2025-09-30 05:51:01.280 357 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: hashing tree
node4 6.015m 2025-09-30 05:51:01.281 358 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: hashing complete
node4 6.015m 2025-09-30 05:51:01.281 359 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls logStatistics()
node4 6.015m 2025-09-30 05:51:01.285 360 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: Finished synchronization {"timeInSeconds":0.468,"hashTimeInSeconds":0.001,"initializationTimeInSeconds":0.0,"totalNodes":12,"leafNodes":7,"redundantLeafNodes":4,"internalNodes":5,"redundantInternalNodes":2} [com.swirlds.logging.legacy.payload.SynchronizationCompletePayload]
node4 6.015m 2025-09-30 05:51:01.286 361 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: ReconnectMapMetrics: transfersFromTeacher=12; transfersFromLearner=10; internalHashes=0; internalCleanHashes=0; internalData=0; internalCleanData=0; leafHashes=3; leafCleanHashes=3; leafData=7; leafCleanData=4
node4 6.015m 2025-09-30 05:51:01.286 362 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner is done synchronizing
node4 6.015m 2025-09-30 05:51:01.290 363 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectLearner: Reconnect data usage report {"dataMegabytes":0.006054878234863281} [com.swirlds.logging.legacy.payload.ReconnectDataUsagePayload]
node4 6.015m 2025-09-30 05:51:01.295 364 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectSyncHelper: Finished reconnect in the role of the receiver. {"receiving":true,"nodeId":4,"otherNodeId":3,"round":752,"success":false} [com.swirlds.logging.legacy.payload.ReconnectFinishPayload]
node4 6.015m 2025-09-30 05:51:01.296 365 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectSyncHelper: Information for state received during reconnect:
Round: 752
Timestamp: 2025-09-30T05:50:59.608106945Z
Next consensus number: 23316
Legacy running event hash: 702c20e90dbbb87283cde297c2c2a4c9202d5120ae3362e93709a570c1a5ea11b29103f957b4306b8d3a68ab21a4a65b
Legacy running event mnemonic: airport-brush-resist-vault
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1802824946
Root hash: 560221273bc3995112212f26797f2d8bdc41a7ecd699154d06859c3142b5641efb3e908d8702720a7885e0fb80437706
(root) ConsistencyTestingToolState / scare-ceiling-journey-sting
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 sting-envelope-sibling-purchase
  1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
  2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
  3 StringLeaf 948623268828485337 /3 butter-bridge-need-oxygen
  4 StringLeaf 751 /4 harbor-liar-grape-crouch
node4 6.015m 2025-09-30 05:51:01.298 367 DEBUG RECONNECT <<reconnect: reconnect-controller>> ReconnectStateLoader: `loadReconnectState` : reloading state
node4 6.015m 2025-09-30 05:51:01.298 368 INFO STARTUP <<reconnect: reconnect-controller>> ConsistencyTestingToolState: State initialized with state long 948623268828485337.
node4 6.015m 2025-09-30 05:51:01.299 369 INFO STARTUP <<reconnect: reconnect-controller>> ConsistencyTestingToolState: State initialized with 751 rounds handled.
node4 6.015m 2025-09-30 05:51:01.299 370 INFO STARTUP <<reconnect: reconnect-controller>> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 6.015m 2025-09-30 05:51:01.299 371 INFO STARTUP <<reconnect: reconnect-controller>> TransactionHandlingHistory: Log file found. Parsing previous history
node4 6.015m 2025-09-30 05:51:01.323 378 INFO STATE_TO_DISK <<reconnect: reconnect-controller>> DefaultSavedStateController: Signed state from round 752 created, will eventually be written to disk, for reason: RECONNECT
node4 6.015m 2025-09-30 05:51:01.324 379 INFO PLATFORM_STATUS <platformForkJoinThread-5> StatusStateMachine: Platform spent 836.0 ms in BEHIND. Now in RECONNECT_COMPLETE
node4 6.015m 2025-09-30 05:51:01.324 380 INFO STARTUP <platformForkJoinThread-6> Shadowgraph: Shadowgraph starting from expiration threshold 725
node4 6.015m 2025-09-30 05:51:01.325 383 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 752 state to disk. Reason: RECONNECT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/752
node4 6.016m 2025-09-30 05:51:01.327 384 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/2 for round 752
node0 6.016m 2025-09-30 05:51:01.335 8783 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 753 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/753
node0 6.016m 2025-09-30 05:51:01.335 8785 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 753
node4 6.016m 2025-09-30 05:51:01.342 394 INFO EVENT_STREAM <<reconnect: reconnect-controller>> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 702c20e90dbbb87283cde297c2c2a4c9202d5120ae3362e93709a570c1a5ea11b29103f957b4306b8d3a68ab21a4a65b
node4 6.016m 2025-09-30 05:51:01.343 395 INFO STARTUP <platformForkJoinThread-5> PcesFileManager: Due to recent operations on this node, the local preconsensus event stream will have a discontinuity. The last file with the old origin round is 2025-09-30T05+45+16.781749331Z_seq0_minr1_maxr364_orgn0.pces. All future files will have an origin round of 752.
node3 6.016m 2025-09-30 05:51:01.364 8715 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> ReconnectTeacher: Finished reconnect in the role of the sender. {"receiving":false,"nodeId":3,"otherNodeId":4,"round":752,"success":false} [com.swirlds.logging.legacy.payload.ReconnectFinishPayload]
node0 6m 1.025s 2025-09-30 05:51:01.422 8823 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 753
node0 6m 1.027s 2025-09-30 05:51:01.424 8824 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 753
Timestamp: 2025-09-30T05:51:00.071071673Z
Next consensus number: 23340
Legacy running event hash: 1d21a97f7843190838d09ae0e522be10136d455cdddca32b66f2580f0f59810322a90531948b3dbc41aaddf6c812b0a8
Legacy running event mnemonic: picnic-south-anchor-canal
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1946087348
Root hash: 39e588b91bcaa65a268f31559b33a91c492eed05072fad387cf6fcd259ef09d55834465d241bfa8f2169ca9e1dcc5724
(root) ConsistencyTestingToolState / device-trophy-junk-crumble
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 keep-receive-service-adapt
    1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
    2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
    3 StringLeaf -2925130888924614976 /3 region-beef-chair-blanket
    4 StringLeaf 752 /4 depart-grocery-kitten-erupt
node0 6m 1.034s 2025-09-30 05:51:01.431 8825 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/09/30/2025-09-30T05+45+16.695999103Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/0/2025/09/30/2025-09-30T05+49+11.675097831Z_seq1_minr473_maxr5473_orgn0.pces
node0 6m 1.035s 2025-09-30 05:51:01.432 8826 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 726
File: data/saved/preconsensus-events/0/2025/09/30/2025-09-30T05+49+11.675097831Z_seq1_minr473_maxr5473_orgn0.pces
node0 6m 1.035s 2025-09-30 05:51:01.432 8827 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 6m 1.039s 2025-09-30 05:51:01.436 8828 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 6m 1.040s 2025-09-30 05:51:01.437 8829 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 753 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/753 {"round":753,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/753/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 6m 1.041s 2025-09-30 05:51:01.438 8830 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/92
node4 6m 1.094s 2025-09-30 05:51:01.491 418 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/2 for round 752
node4 6m 1.097s 2025-09-30 05:51:01.494 419 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 752
Timestamp: 2025-09-30T05:50:59.608106945Z
Next consensus number: 23316
Legacy running event hash: 702c20e90dbbb87283cde297c2c2a4c9202d5120ae3362e93709a570c1a5ea11b29103f957b4306b8d3a68ab21a4a65b
Legacy running event mnemonic: airport-brush-resist-vault
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1802824946
Root hash: 560221273bc3995112212f26797f2d8bdc41a7ecd699154d06859c3142b5641efb3e908d8702720a7885e0fb80437706
(root) ConsistencyTestingToolState / scare-ceiling-journey-sting
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 sting-envelope-sibling-purchase
    1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
    2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
    3 StringLeaf 948623268828485337 /3 butter-bridge-need-oxygen
    4 StringLeaf 751 /4 harbor-liar-grape-crouch
node4 6m 1.142s 2025-09-30 05:51:01.539 431 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/09/30/2025-09-30T05+45+16.781749331Z_seq0_minr1_maxr364_orgn0.pces
node4 6m 1.142s 2025-09-30 05:51:01.539 432 WARN STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: No preconsensus event files meeting specified criteria found to copy. Lower bound: 725
node4 6m 1.151s 2025-09-30 05:51:01.548 433 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 752 to disk. Reason: RECONNECT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/752 {"round":752,"freezeState":false,"reason":"RECONNECT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/752/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 6m 1.158s 2025-09-30 05:51:01.555 434 INFO PLATFORM_STATUS <platformForkJoinThread-5> StatusStateMachine: Platform spent 229.0 ms in RECONNECT_COMPLETE. Now in CHECKING
node4 6m 1.516s 2025-09-30 05:51:01.913 435 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting4.csv' ]
node4 6m 1.518s 2025-09-30 05:51:01.915 436 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node4 6m 1.907s 2025-09-30 05:51:02.304 437 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:1 H:fa7d261b0e88 BR:750), num remaining: 3
node4 6m 1.908s 2025-09-30 05:51:02.305 438 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:3 H:984508e6b18d BR:750), num remaining: 2
node4 6m 1.909s 2025-09-30 05:51:02.306 439 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:2 H:50c004318860 BR:751), num remaining: 1
node4 6m 1.909s 2025-09-30 05:51:02.306 440 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:0 H:68620f36cf69 BR:751), num remaining: 0
node4 6m 1.951s 2025-09-30 05:51:02.348 442 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 753 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 6m 1.954s 2025-09-30 05:51:02.351 445 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 753 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/753
node4 6m 1.954s 2025-09-30 05:51:02.351 453 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 753
node4 6m 2.063s 2025-09-30 05:51:02.460 492 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 753
node4 6m 2.065s 2025-09-30 05:51:02.462 493 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 753
Timestamp: 2025-09-30T05:51:00.071071673Z
Next consensus number: 23340
Legacy running event hash: 1d21a97f7843190838d09ae0e522be10136d455cdddca32b66f2580f0f59810322a90531948b3dbc41aaddf6c812b0a8
Legacy running event mnemonic: picnic-south-anchor-canal
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1946087348
Root hash: 39e588b91bcaa65a268f31559b33a91c492eed05072fad387cf6fcd259ef09d55834465d241bfa8f2169ca9e1dcc5724
(root) ConsistencyTestingToolState / device-trophy-junk-crumble
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 keep-receive-service-adapt
    1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
    2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
    3 StringLeaf -2925130888924614976 /3 region-beef-chair-blanket
    4 StringLeaf 752 /4 depart-grocery-kitten-erupt
node4 6m 2.076s 2025-09-30 05:51:02.473 494 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/4/2025/09/30/2025-09-30T05+51+01.733953293Z_seq1_minr725_maxr1225_orgn752.pces
Last file: data/saved/preconsensus-events/4/2025/09/30/2025-09-30T05+45+16.781749331Z_seq0_minr1_maxr364_orgn0.pces
node4 6m 2.077s 2025-09-30 05:51:02.474 495 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 726
File: data/saved/preconsensus-events/4/2025/09/30/2025-09-30T05+51+01.733953293Z_seq1_minr725_maxr1225_orgn752.pces
node4 6m 2.077s 2025-09-30 05:51:02.474 496 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 6m 2.080s 2025-09-30 05:51:02.477 497 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 6m 2.081s 2025-09-30 05:51:02.478 498 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 753 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/753 {"round":753,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/753/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 6m 2.086s 2025-09-30 05:51:02.483 499 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/2
node2 6m 5.161s 2025-09-30 05:51:05.558 8836 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> RpcPeerHandler: OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=762,ancientThreshold=735,expiredThreshold=661] remote ev=EventWindow[latestConsensusRound=363,ancientThreshold=336,expiredThreshold=312]
node4 6m 5.232s 2025-09-30 05:51:05.629 587 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> RpcPeerHandler: SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=363,ancientThreshold=336,expiredThreshold=312] remote ev=EventWindow[latestConsensusRound=762,ancientThreshold=735,expiredThreshold=661]
node4 6m 5.233s 2025-09-30 05:51:05.630 588 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> RpcPeerHandler: Latest event window is not really falling behind, will retry sync local ev=EventWindow[latestConsensusRound=762,ancientThreshold=735,expiredThreshold=725] remote ev=EventWindow[latestConsensusRound=762,ancientThreshold=735,expiredThreshold=661]
node4 6m 5.936s 2025-09-30 05:51:06.333 608 INFO PLATFORM_STATUS <platformForkJoinThread-5> StatusStateMachine: Platform spent 4.8 s in CHECKING. Now in ACTIVE
node1 7.012m 2025-09-30 05:52:01.101 10277 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 885 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 7.013m 2025-09-30 05:52:01.188 10178 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 885 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 7.013m 2025-09-30 05:52:01.194 10273 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 885 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 7.014m 2025-09-30 05:52:01.253 1916 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 885 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 7.015m 2025-09-30 05:52:01.310 10144 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 885 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 7m 1.012s 2025-09-30 05:52:01.409 10280 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 885 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/885
node2 7m 1.012s 2025-09-30 05:52:01.409 10181 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 885 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/885
node1 7m 1.013s 2025-09-30 05:52:01.410 10281 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 885
node2 7m 1.013s 2025-09-30 05:52:01.410 10182 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 885
node3 7m 1.013s 2025-09-30 05:52:01.410 10147 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 885 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/885
node3 7m 1.014s 2025-09-30 05:52:01.411 10148 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/44 for round 885
node3 7m 1.098s 2025-09-30 05:52:01.495 10181 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/44 for round 885
node1 7m 1.099s 2025-09-30 05:52:01.496 10314 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 885
node3 7m 1.100s 2025-09-30 05:52:01.497 10182 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 885
Timestamp: 2025-09-30T05:52:00.164654Z
Next consensus number: 28053
Legacy running event hash: 8e5a10261e1e39e8e4ed9de99f0690a0814080ab22a76054cc7f036e9123668bc9df90263faa92890b6696ee3e6a1c8b
Legacy running event mnemonic: crumble-age-sense-dial
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 560737404
Root hash: b1c5a25d13d15b3ac4a764ecdee271ff28191b2643b2246b0dcdc20292fcd90d37dc2694c6cb689aae15155e80a45c27
(root) ConsistencyTestingToolState / shuffle-refuse-kick-guard
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 dune-rally-erase-burger
    1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
    2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
    3 StringLeaf 7098177334146875313 /3 useful-laundry-shield-tackle
    4 StringLeaf 884 /4 illness-coast-orient-flat
node1 7m 1.101s 2025-09-30 05:52:01.498 10315 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 885
Timestamp: 2025-09-30T05:52:00.164654Z
Next consensus number: 28053
Legacy running event hash: 8e5a10261e1e39e8e4ed9de99f0690a0814080ab22a76054cc7f036e9123668bc9df90263faa92890b6696ee3e6a1c8b
Legacy running event mnemonic: crumble-age-sense-dial
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 560737404
Root hash: b1c5a25d13d15b3ac4a764ecdee271ff28191b2643b2246b0dcdc20292fcd90d37dc2694c6cb689aae15155e80a45c27
(root) ConsistencyTestingToolState / shuffle-refuse-kick-guard
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 dune-rally-erase-burger
    1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
    2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
    3 StringLeaf 7098177334146875313 /3 useful-laundry-shield-tackle
    4 StringLeaf 884 /4 illness-coast-orient-flat
node2 7m 1.103s 2025-09-30 05:52:01.500 10215 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 885
node3 7m 1.107s 2025-09-30 05:52:01.504 10183 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/09/30/2025-09-30T05+49+11.568413001Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/3/2025/09/30/2025-09-30T05+45+16.698864552Z_seq0_minr1_maxr501_orgn0.pces
node3 7m 1.107s 2025-09-30 05:52:01.504 10184 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 858
File: data/saved/preconsensus-events/3/2025/09/30/2025-09-30T05+49+11.568413001Z_seq1_minr474_maxr5474_orgn0.pces
node3 7m 1.107s 2025-09-30 05:52:01.504 10185 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 7m 1.108s 2025-09-30 05:52:01.505 10216 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 885
Timestamp: 2025-09-30T05:52:00.164654Z
Next consensus number: 28053
Legacy running event hash: 8e5a10261e1e39e8e4ed9de99f0690a0814080ab22a76054cc7f036e9123668bc9df90263faa92890b6696ee3e6a1c8b
Legacy running event mnemonic: crumble-age-sense-dial
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 560737404
Root hash: b1c5a25d13d15b3ac4a764ecdee271ff28191b2643b2246b0dcdc20292fcd90d37dc2694c6cb689aae15155e80a45c27
(root) ConsistencyTestingToolState / shuffle-refuse-kick-guard
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 dune-rally-erase-burger
    1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
    2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
    3 StringLeaf 7098177334146875313 /3 useful-laundry-shield-tackle
    4 StringLeaf 884 /4 illness-coast-orient-flat
node1 7m 1.110s 2025-09-30 05:52:01.507 10316 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/09/30/2025-09-30T05+45+16.529211986Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/1/2025/09/30/2025-09-30T05+49+11.558579248Z_seq1_minr474_maxr5474_orgn0.pces
node1 7m 1.111s 2025-09-30 05:52:01.508 10317 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 858
File: data/saved/preconsensus-events/1/2025/09/30/2025-09-30T05+49+11.558579248Z_seq1_minr474_maxr5474_orgn0.pces
node1 7m 1.111s 2025-09-30 05:52:01.508 10318 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 7m 1.114s 2025-09-30 05:52:01.511 10186 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 7m 1.115s 2025-09-30 05:52:01.512 10217 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/09/30/2025-09-30T05+45+16.662580277Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/2/2025/09/30/2025-09-30T05+49+11.525370091Z_seq1_minr474_maxr5474_orgn0.pces
node2 7m 1.115s 2025-09-30 05:52:01.512 10218 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 858
File: data/saved/preconsensus-events/2/2025/09/30/2025-09-30T05+49+11.525370091Z_seq1_minr474_maxr5474_orgn0.pces
node2 7m 1.115s 2025-09-30 05:52:01.512 10219 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 7m 1.115s 2025-09-30 05:52:01.512 10187 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 885 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/885 {"round":885,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/885/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 7m 1.116s 2025-09-30 05:52:01.513 10188 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/213
node1 7m 1.118s 2025-09-30 05:52:01.515 10319 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 7m 1.119s 2025-09-30 05:52:01.516 10320 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 885 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/885 {"round":885,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/885/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 7m 1.120s 2025-09-30 05:52:01.517 10321 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/213
node2 7m 1.122s 2025-09-30 05:52:01.519 10220 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 7m 1.123s 2025-09-30 05:52:01.520 10221 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 885 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/885 {"round":885,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/885/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 7m 1.124s 2025-09-30 05:52:01.521 10222 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/213
node4 7m 1.153s 2025-09-30 05:52:01.550 1929 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 885 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/885
node4 7m 1.153s 2025-09-30 05:52:01.550 1930 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/14 for round 885
node0 7m 1.229s 2025-09-30 05:52:01.626 10286 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 885 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/885
node0 7m 1.230s 2025-09-30 05:52:01.627 10288 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 885
node4 7m 1.260s 2025-09-30 05:52:01.657 1972 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/14 for round 885
node4 7m 1.263s 2025-09-30 05:52:01.660 1973 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 885
Timestamp: 2025-09-30T05:52:00.164654Z
Next consensus number: 28053
Legacy running event hash: 8e5a10261e1e39e8e4ed9de99f0690a0814080ab22a76054cc7f036e9123668bc9df90263faa92890b6696ee3e6a1c8b
Legacy running event mnemonic: crumble-age-sense-dial
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 560737404
Root hash: b1c5a25d13d15b3ac4a764ecdee271ff28191b2643b2246b0dcdc20292fcd90d37dc2694c6cb689aae15155e80a45c27
(root) ConsistencyTestingToolState / shuffle-refuse-kick-guard
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 dune-rally-erase-burger
    1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
    2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
    3 StringLeaf 7098177334146875313 /3 useful-laundry-shield-tackle
    4 StringLeaf 884 /4 illness-coast-orient-flat
node4 7m 1.273s 2025-09-30 05:52:01.670 1974 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/4/2025/09/30/2025-09-30T05+51+01.733953293Z_seq1_minr725_maxr1225_orgn752.pces
Last file: data/saved/preconsensus-events/4/2025/09/30/2025-09-30T05+45+16.781749331Z_seq0_minr1_maxr364_orgn0.pces
node4 7m 1.273s 2025-09-30 05:52:01.670 1975 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 858
File: data/saved/preconsensus-events/4/2025/09/30/2025-09-30T05+51+01.733953293Z_seq1_minr725_maxr1225_orgn752.pces
node4 7m 1.273s 2025-09-30 05:52:01.670 1976 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 7m 1.277s 2025-09-30 05:52:01.674 1977 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 7m 1.277s 2025-09-30 05:52:01.674 1978 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 885 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/885 {"round":885,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/885/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 7m 1.279s 2025-09-30 05:52:01.676 1979 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/92
node0 7m 1.320s 2025-09-30 05:52:01.717 10326 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 885
node0 7m 1.322s 2025-09-30 05:52:01.719 10327 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 885
Timestamp: 2025-09-30T05:52:00.164654Z
Next consensus number: 28053
Legacy running event hash: 8e5a10261e1e39e8e4ed9de99f0690a0814080ab22a76054cc7f036e9123668bc9df90263faa92890b6696ee3e6a1c8b
Legacy running event mnemonic: crumble-age-sense-dial
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 560737404
Root hash: b1c5a25d13d15b3ac4a764ecdee271ff28191b2643b2246b0dcdc20292fcd90d37dc2694c6cb689aae15155e80a45c27
(root) ConsistencyTestingToolState / shuffle-refuse-kick-guard
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 dune-rally-erase-burger
    1 SingletonNode RosterService.ROSTER_STATE /1 canoe-sister-hurdle-curve
    2 VirtualMap RosterService.ROSTERS /2 furnace-sea-flash-job
    3 StringLeaf 7098177334146875313 /3 useful-laundry-shield-tackle
    4 StringLeaf 884 /4 illness-coast-orient-flat
node0 7m 1.335s 2025-09-30 05:52:01.732 10336 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/09/30/2025-09-30T05+45+16.695999103Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/0/2025/09/30/2025-09-30T05+49+11.675097831Z_seq1_minr473_maxr5473_orgn0.pces
node0 7m 1.335s 2025-09-30 05:52:01.732 10337 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 858
File: data/saved/preconsensus-events/0/2025/09/30/2025-09-30T05+49+11.675097831Z_seq1_minr473_maxr5473_orgn0.pces
node0 7m 1.335s 2025-09-30 05:52:01.732 10338 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 7m 1.343s 2025-09-30 05:52:01.740 10339 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 7m 1.344s 2025-09-30 05:52:01.741 10340 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 885 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/885 {"round":885,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/885/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 7m 1.346s 2025-09-30 05:52:01.743 10341 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/213