node1 0.000ns 2025-09-26 03:11:02.259 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node1 88.000ms 2025-09-26 03:11:02.347 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node1 103.000ms 2025-09-26 03:11:02.362 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
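Each entry above follows a fixed layout: node ID, elapsed time since start, wall-clock timestamp, per-node sequence number, log level, log marker, thread, logging class, and message (payload dumps continue on unprefixed lines). A minimal sketch of a parser for this layout — the regex and field names are mine, not part of the platform:

```python
import re

# Assumed entry layout (inferred from the log above, not an official format):
#   <node> <elapsed> <date> <time> <seq> <level> <marker> <thread> <class>: <message>
LINE_RE = re.compile(
    r"^(?P<node>node\d+)\s+"
    r"(?P<elapsed>\S+)\s+"
    r"(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d{3})\s+"
    r"(?P<seq>\d+)\s+"
    r"(?P<level>TRACE|DEBUG|INFO|WARN|ERROR|FATAL)\s+"
    r"(?P<marker>\S+)\s+"
    r"<(?P<thread>[^>]+)>\s+"
    r"(?P<cls>\S+):\s?(?P<message>.*)$"
)

def parse_entry(line: str):
    """Return the fields of one log entry, or None for continuation lines."""
    m = LINE_RE.match(line)
    return m.groupdict() if m else None

entry = parse_entry(
    "node1 88.000ms 2025-09-26 03:11:02.347 2 DEBUG STARTUP <main> "
    "StaticPlatformBuilder: main() started {} "
    "[com.swirlds.logging.legacy.payload.NodeStartPayload]"
)
print(entry["node"], entry["level"], entry["cls"])  # node1 DEBUG StaticPlatformBuilder
```

Continuation lines (banner boxes, JSON payloads, state dumps) do not match and return `None`, so they can be appended to the preceding entry's message.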
node2 107.000ms 2025-09-26 03:11:02.366 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node2 194.000ms 2025-09-26 03:11:02.453 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node2 209.000ms 2025-09-26 03:11:02.468 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node1 216.000ms 2025-09-26 03:11:02.475 4 INFO STARTUP <main> Browser: The following nodes [1] are set to run locally
node1 223.000ms 2025-09-26 03:11:02.482 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node1 235.000ms 2025-09-26 03:11:02.494 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node2 322.000ms 2025-09-26 03:11:02.581 4 INFO STARTUP <main> Browser: The following nodes [2] are set to run locally
node2 328.000ms 2025-09-26 03:11:02.587 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node2 341.000ms 2025-09-26 03:11:02.600 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node1 650.000ms 2025-09-26 03:11:02.909 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node1 651.000ms 2025-09-26 03:11:02.910 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node2 764.000ms 2025-09-26 03:11:03.023 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node2 765.000ms 2025-09-26 03:11:03.024 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node4 1.198s 2025-09-26 03:11:03.457 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node4 1.287s 2025-09-26 03:11:03.546 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node4 1.302s 2025-09-26 03:11:03.561 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 1.413s 2025-09-26 03:11:03.672 4 INFO STARTUP <main> Browser: The following nodes [4] are set to run locally
node4 1.419s 2025-09-26 03:11:03.678 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node4 1.431s 2025-09-26 03:11:03.690 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node1 1.565s 2025-09-26 03:11:03.824 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 914ms
node1 1.573s 2025-09-26 03:11:03.832 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node1 1.575s 2025-09-26 03:11:03.834 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node1 1.628s 2025-09-26 03:11:03.887 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node1 1.693s 2025-09-26 03:11:03.952 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node1 1.694s 2025-09-26 03:11:03.953 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node2 1.715s 2025-09-26 03:11:03.974 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 949ms
node3 1.717s 2025-09-26 03:11:03.976 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node2 1.723s 2025-09-26 03:11:03.982 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node2 1.726s 2025-09-26 03:11:03.985 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node2 1.764s 2025-09-26 03:11:04.023 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node3 1.812s 2025-09-26 03:11:04.071 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node3 1.829s 2025-09-26 03:11:04.088 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node2 1.843s 2025-09-26 03:11:04.102 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node2 1.844s 2025-09-26 03:11:04.103 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node4 1.847s 2025-09-26 03:11:04.106 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node4 1.848s 2025-09-26 03:11:04.107 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node0 1.945s 2025-09-26 03:11:04.204 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node3 1.950s 2025-09-26 03:11:04.209 4 INFO STARTUP <main> Browser: The following nodes [3] are set to run locally
node3 1.958s 2025-09-26 03:11:04.217 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node3 1.970s 2025-09-26 03:11:04.229 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node0 2.052s 2025-09-26 03:11:04.311 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node0 2.071s 2025-09-26 03:11:04.330 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node0 2.210s 2025-09-26 03:11:04.469 4 INFO STARTUP <main> Browser: The following nodes [0] are set to run locally
node0 2.218s 2025-09-26 03:11:04.477 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node0 2.232s 2025-09-26 03:11:04.491 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node3 2.418s 2025-09-26 03:11:04.677 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node3 2.419s 2025-09-26 03:11:04.678 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node4 2.700s 2025-09-26 03:11:04.959 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 852ms
node4 2.708s 2025-09-26 03:11:04.967 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node4 2.711s 2025-09-26 03:11:04.970 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node0 2.731s 2025-09-26 03:11:04.990 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node0 2.733s 2025-09-26 03:11:04.992 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node4 2.750s 2025-09-26 03:11:05.009 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node4 2.814s 2025-09-26 03:11:05.073 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node4 2.815s 2025-09-26 03:11:05.074 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node3 3.551s 2025-09-26 03:11:05.810 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1132ms
node3 3.559s 2025-09-26 03:11:05.818 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node3 3.562s 2025-09-26 03:11:05.821 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node3 3.615s 2025-09-26 03:11:05.874 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node3 3.682s 2025-09-26 03:11:05.941 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node3 3.683s 2025-09-26 03:11:05.942 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node1 3.754s 2025-09-26 03:11:06.013 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node2 3.814s 2025-09-26 03:11:06.073 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node1 3.842s 2025-09-26 03:11:06.101 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 3.845s 2025-09-26 03:11:06.104 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node1 3.846s 2025-09-26 03:11:06.105 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node2 3.893s 2025-09-26 03:11:06.152 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 3.895s 2025-09-26 03:11:06.154 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node2 3.895s 2025-09-26 03:11:06.154 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node0 3.980s 2025-09-26 03:11:06.239 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1247ms
node0 3.994s 2025-09-26 03:11:06.253 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node0 3.998s 2025-09-26 03:11:06.257 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node0 4.048s 2025-09-26 03:11:06.307 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node0 4.127s 2025-09-26 03:11:06.386 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node0 4.130s 2025-09-26 03:11:06.389 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node1 4.629s 2025-09-26 03:11:06.888 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 4.632s 2025-09-26 03:11:06.891 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node1 4.638s 2025-09-26 03:11:06.897 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node1 4.651s 2025-09-26 03:11:06.910 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 4.653s 2025-09-26 03:11:06.912 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 4.661s 2025-09-26 03:11:06.920 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 4.665s 2025-09-26 03:11:06.924 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node2 4.671s 2025-09-26 03:11:06.930 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node2 4.683s 2025-09-26 03:11:06.942 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 4.686s 2025-09-26 03:11:06.945 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 4.864s 2025-09-26 03:11:07.123 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node4 4.952s 2025-09-26 03:11:07.211 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 4.954s 2025-09-26 03:11:07.213 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node4 4.955s 2025-09-26 03:11:07.214 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node3 5.752s 2025-09-26 03:11:08.011 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node1 5.775s 2025-09-26 03:11:08.034 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26356222]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=142140, randomLong=5897841277760518028, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=13390, randomLong=-8235305628978776760, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1116780, data=35, exception=null]
OS Health Check Report - Complete (took 1024 ms)
node4 5.781s 2025-09-26 03:11:08.040 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5.784s 2025-09-26 03:11:08.043 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node4 5.791s 2025-09-26 03:11:08.050 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node4 5.801s 2025-09-26 03:11:08.060 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5.804s 2025-09-26 03:11:08.063 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 5.809s 2025-09-26 03:11:08.068 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node2 5.811s 2025-09-26 03:11:08.070 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26421669]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=317320, randomLong=-6674519462129620005, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=14650, randomLong=1904089771225286269, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1442429, data=35, exception=null]
OS Health Check Report - Complete (took 1025 ms)
node1 5.816s 2025-09-26 03:11:08.075 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node1 5.822s 2025-09-26 03:11:08.081 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node3 5.836s 2025-09-26 03:11:08.095 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 5.838s 2025-09-26 03:11:08.097 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node3 5.839s 2025-09-26 03:11:08.098 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node2 5.845s 2025-09-26 03:11:08.104 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node2 5.853s 2025-09-26 03:11:08.112 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node2 5.858s 2025-09-26 03:11:08.117 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node1 5.898s 2025-09-26 03:11:08.157 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "IocxRg==", "port": 30124 }, { "ipAddressV4": "CoAAXA==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "aMaPYA==", "port": 30125 }, { "ipAddressV4": "CoAAXw==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "IjgW3A==", "port": 30126 }, { "ipAddressV4": "CoAAXQ==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "Iogp+A==", "port": 30127 }, { "ipAddressV4": "CoAAXg==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "Ih0pyA==", "port": 30128 }, { "ipAddressV4": "CoAAYA==", "port": 30128 }] }] }
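The roster above serializes each gossip endpoint's `ipAddressV4` as base64-encoded raw bytes rather than dotted-quad text. A small helper (the function name is mine, not from the platform) makes them readable:

```python
import base64
import socket

def decode_ipv4(b64: str) -> str:
    """Decode a base64-encoded 4-byte IPv4 address into dotted-quad form."""
    raw = base64.b64decode(b64)
    assert len(raw) == 4, "expected exactly four bytes for an IPv4 address"
    return socket.inet_ntoa(raw)

# Endpoints of the first roster entry above:
print(decode_ipv4("IocxRg=="))  # 34.135.49.70  (external endpoint, port 30124)
print(decode_ipv4("CoAAXA=="))  # 10.128.0.92   (internal endpoint, port 30124)
```

The second address in each pair falls in the 10.128.0.0/9 private range, consistent with nodes that also advertise a public endpoint.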
node1 5.920s 2025-09-26 03:11:08.179 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/1/ConsistencyTestLog.csv
node1 5.920s 2025-09-26 03:11:08.179 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node1 5.934s 2025-09-26 03:11:08.193 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: 97d91d16d726634dd9c7dfd82761a869a1544e0a81676a973a13c9f60152666e7e053779f3cc036ccfdbe172c6772ff4
(root) ConsistencyTestingToolState /   agent-people-brave-have
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0  method-topple-elite-gate
    1 SingletonNode RosterService.ROSTER_STATE          /1  library-agent-antique-estate
    2 VirtualMap    RosterService.ROSTERS               /2  box-toast-control-amazing
node2 5.941s 2025-09-26 03:11:08.200 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "IocxRg==", "port": 30124 }, { "ipAddressV4": "CoAAXA==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "aMaPYA==", "port": 30125 }, { "ipAddressV4": "CoAAXw==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "IjgW3A==", "port": 30126 }, { "ipAddressV4": "CoAAXQ==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "Iogp+A==", "port": 30127 }, { "ipAddressV4": "CoAAXg==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "Ih0pyA==", "port": 30128 }, { "ipAddressV4": "CoAAYA==", "port": 30128 }] }] }
node2 5.963s 2025-09-26 03:11:08.222 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/2/ConsistencyTestLog.csv
node2 5.963s 2025-09-26 03:11:08.222 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node2 5.978s 2025-09-26 03:11:08.237 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: 97d91d16d726634dd9c7dfd82761a869a1544e0a81676a973a13c9f60152666e7e053779f3cc036ccfdbe172c6772ff4
(root) ConsistencyTestingToolState / agent-people-brave-have
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate
  1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
  2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
node1 6.142s 2025-09-26 03:11:08.401 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node1 6.147s 2025-09-26 03:11:08.406 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node1 6.151s 2025-09-26 03:11:08.410 47 INFO STARTUP <<start-node-1>> ConsistencyTestingToolMain: init called in Main for node 1.
node1 6.151s 2025-09-26 03:11:08.410 48 INFO STARTUP <<start-node-1>> SwirldsPlatform: Starting platform 1
node1 6.152s 2025-09-26 03:11:08.411 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node1 6.156s 2025-09-26 03:11:08.415 50 INFO STARTUP <<start-node-1>> CycleFinder: No cyclical back pressure detected in wiring model.
node1 6.157s 2025-09-26 03:11:08.416 51 INFO STARTUP <<start-node-1>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node1 6.158s 2025-09-26 03:11:08.417 52 INFO STARTUP <<start-node-1>> InputWireChecks: All input wires have been bound.
node1 6.159s 2025-09-26 03:11:08.418 53 WARN STARTUP <<start-node-1>> PcesFileTracker: No preconsensus event files available
node1 6.160s 2025-09-26 03:11:08.419 54 INFO STARTUP <<start-node-1>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node1 6.161s 2025-09-26 03:11:08.420 55 INFO STARTUP <<start-node-1>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node1 6.162s 2025-09-26 03:11:08.421 56 INFO STARTUP <<app: appMain 1>> ConsistencyTestingToolMain: run called in Main.
node1 6.163s 2025-09-26 03:11:08.422 57 INFO PLATFORM_STATUS <platformForkJoinThread-3> DefaultStatusStateMachine: Platform spent 172.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node1 6.168s 2025-09-26 03:11:08.427 58 INFO PLATFORM_STATUS <platformForkJoinThread-3> DefaultStatusStateMachine: Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node2 6.185s 2025-09-26 03:11:08.444 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node2 6.189s 2025-09-26 03:11:08.448 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node2 6.195s 2025-09-26 03:11:08.454 47 INFO STARTUP <<start-node-2>> ConsistencyTestingToolMain: init called in Main for node 2.
node2 6.195s 2025-09-26 03:11:08.454 48 INFO STARTUP <<start-node-2>> SwirldsPlatform: Starting platform 2
node2 6.197s 2025-09-26 03:11:08.456 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node2 6.200s 2025-09-26 03:11:08.459 50 INFO STARTUP <<start-node-2>> CycleFinder: No cyclical back pressure detected in wiring model.
node2 6.202s 2025-09-26 03:11:08.461 51 INFO STARTUP <<start-node-2>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node2 6.202s 2025-09-26 03:11:08.461 52 INFO STARTUP <<start-node-2>> InputWireChecks: All input wires have been bound.
node2 6.204s 2025-09-26 03:11:08.463 53 WARN STARTUP <<start-node-2>> PcesFileTracker: No preconsensus event files available
node2 6.204s 2025-09-26 03:11:08.463 54 INFO STARTUP <<start-node-2>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node2 6.205s 2025-09-26 03:11:08.464 55 INFO STARTUP <<start-node-2>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node2 6.207s 2025-09-26 03:11:08.466 56 INFO STARTUP <<app: appMain 2>> ConsistencyTestingToolMain: run called in Main.
node2 6.208s 2025-09-26 03:11:08.467 57 INFO PLATFORM_STATUS <platformForkJoinThread-1> DefaultStatusStateMachine: Platform spent 174.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node2 6.212s 2025-09-26 03:11:08.471 58 INFO PLATFORM_STATUS <platformForkJoinThread-1> DefaultStatusStateMachine: Platform spent 3.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node0 6.298s 2025-09-26 03:11:08.557 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node0 6.398s 2025-09-26 03:11:08.657 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 6.401s 2025-09-26 03:11:08.660 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node0 6.402s 2025-09-26 03:11:08.661 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node3 6.640s 2025-09-26 03:11:08.899 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 6.643s 2025-09-26 03:11:08.902 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node3 6.650s 2025-09-26 03:11:08.909 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node3 6.662s 2025-09-26 03:11:08.921 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 6.663s 2025-09-26 03:11:08.922 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 6.918s 2025-09-26 03:11:09.177 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26359313]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=132640, randomLong=2652852010824795313, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=10140, randomLong=-6499448576284126096, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1005860, data=35, exception=null]
OS Health Check Report - Complete (took 1020 ms)
node4 6.948s 2025-09-26 03:11:09.207 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node4 6.955s 2025-09-26 03:11:09.214 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node4 6.960s 2025-09-26 03:11:09.219 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node4 7.037s 2025-09-26 03:11:09.296 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "IocxRg==", "port": 30124 }, { "ipAddressV4": "CoAAXA==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "aMaPYA==", "port": 30125 }, { "ipAddressV4": "CoAAXw==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "IjgW3A==", "port": 30126 }, { "ipAddressV4": "CoAAXQ==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "Iogp+A==", "port": 30127 }, { "ipAddressV4": "CoAAXg==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "Ih0pyA==", "port": 30128 }, { "ipAddressV4": "CoAAYA==", "port": 30128 }] }] }
node4 7.057s 2025-09-26 03:11:09.316 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 7.058s 2025-09-26 03:11:09.317 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node4 7.072s 2025-09-26 03:11:09.331 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: 97d91d16d726634dd9c7dfd82761a869a1544e0a81676a973a13c9f60152666e7e053779f3cc036ccfdbe172c6772ff4
(root) ConsistencyTestingToolState / agent-people-brave-have
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate
  1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
  2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
node4 7.259s 2025-09-26 03:11:09.518 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node4 7.264s 2025-09-26 03:11:09.523 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node4 7.268s 2025-09-26 03:11:09.527 47 INFO STARTUP <<start-node-4>> ConsistencyTestingToolMain: init called in Main for node 4.
node4 7.269s 2025-09-26 03:11:09.528 48 INFO STARTUP <<start-node-4>> SwirldsPlatform: Starting platform 4
node4 7.270s 2025-09-26 03:11:09.529 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node4 7.274s 2025-09-26 03:11:09.533 50 INFO STARTUP <<start-node-4>> CycleFinder: No cyclical back pressure detected in wiring model.
node4 7.275s 2025-09-26 03:11:09.534 51 INFO STARTUP <<start-node-4>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node4 7.276s 2025-09-26 03:11:09.535 52 INFO STARTUP <<start-node-4>> InputWireChecks: All input wires have been bound.
node4 7.277s 2025-09-26 03:11:09.536 53 WARN STARTUP <<start-node-4>> PcesFileTracker: No preconsensus event files available
node4 7.278s 2025-09-26 03:11:09.537 54 INFO STARTUP <<start-node-4>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node4 7.279s 2025-09-26 03:11:09.538 55 INFO STARTUP <<start-node-4>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node4 7.280s 2025-09-26 03:11:09.539 56 INFO STARTUP <<app: appMain 4>> ConsistencyTestingToolMain: run called in Main.
node4 7.281s 2025-09-26 03:11:09.540 57 INFO PLATFORM_STATUS <platformForkJoinThread-6> DefaultStatusStateMachine: Platform spent 152.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node4 7.286s 2025-09-26 03:11:09.545 58 INFO PLATFORM_STATUS <platformForkJoinThread-6> DefaultStatusStateMachine: Platform spent 3.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node0 7.373s 2025-09-26 03:11:09.632 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 7.376s 2025-09-26 03:11:09.635 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node0 7.386s 2025-09-26 03:11:09.645 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node0 7.402s 2025-09-26 03:11:09.661 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 7.404s 2025-09-26 03:11:09.663 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 7.801s 2025-09-26 03:11:10.060 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26322604]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=241139, randomLong=83312075591256070, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=10191, randomLong=3860808460439579585, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1517026, data=35, exception=null]
OS Health Check Report - Complete (took 1028 ms)
node3 7.840s 2025-09-26 03:11:10.099 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node3 7.850s 2025-09-26 03:11:10.109 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node3 7.856s 2025-09-26 03:11:10.115 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node3 7.951s 2025-09-26 03:11:10.210 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "IocxRg==", "port": 30124 }, { "ipAddressV4": "CoAAXA==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "aMaPYA==", "port": 30125 }, { "ipAddressV4": "CoAAXw==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "IjgW3A==", "port": 30126 }, { "ipAddressV4": "CoAAXQ==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "Iogp+A==", "port": 30127 }, { "ipAddressV4": "CoAAXg==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "Ih0pyA==", "port": 30128 }, { "ipAddressV4": "CoAAYA==", "port": 30128 }] }] }
node3 7.978s 2025-09-26 03:11:10.237 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/3/ConsistencyTestLog.csv
node3 7.978s 2025-09-26 03:11:10.237 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node3 7.994s 2025-09-26 03:11:10.253 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: 97d91d16d726634dd9c7dfd82761a869a1544e0a81676a973a13c9f60152666e7e053779f3cc036ccfdbe172c6772ff4
(root) ConsistencyTestingToolState / agent-people-brave-have
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate
  1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
  2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
node3 8.204s 2025-09-26 03:11:10.463 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node3 8.210s 2025-09-26 03:11:10.469 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node3 8.215s 2025-09-26 03:11:10.474 47 INFO STARTUP <<start-node-3>> ConsistencyTestingToolMain: init called in Main for node 3.
node3 8.216s 2025-09-26 03:11:10.475 48 INFO STARTUP <<start-node-3>> SwirldsPlatform: Starting platform 3
node3 8.217s 2025-09-26 03:11:10.476 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node3 8.221s 2025-09-26 03:11:10.480 50 INFO STARTUP <<start-node-3>> CycleFinder: No cyclical back pressure detected in wiring model.
node3 8.222s 2025-09-26 03:11:10.481 51 INFO STARTUP <<start-node-3>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node3 8.222s 2025-09-26 03:11:10.481 52 INFO STARTUP <<start-node-3>> InputWireChecks: All input wires have been bound.
node3 8.224s 2025-09-26 03:11:10.483 53 WARN STARTUP <<start-node-3>> PcesFileTracker: No preconsensus event files available
node3 8.224s 2025-09-26 03:11:10.483 54 INFO STARTUP <<start-node-3>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node3 8.226s 2025-09-26 03:11:10.485 55 INFO STARTUP <<start-node-3>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node3 8.227s 2025-09-26 03:11:10.486 56 INFO STARTUP <<app: appMain 3>> ConsistencyTestingToolMain: run called in Main.
node3 8.228s 2025-09-26 03:11:10.487 57 INFO PLATFORM_STATUS <platformForkJoinThread-4> DefaultStatusStateMachine: Platform spent 174.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node3 8.233s 2025-09-26 03:11:10.492 58 INFO PLATFORM_STATUS <platformForkJoinThread-4> DefaultStatusStateMachine: Platform spent 3.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node0 8.535s 2025-09-26 03:11:10.794 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26062097]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=324889, randomLong=-964526029232223222, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=22310, randomLong=-1653834924284432239, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1536206, data=35, exception=null]
OS Health Check Report - Complete (took 1031 ms)
node0 8.577s 2025-09-26 03:11:10.836 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node0 8.589s 2025-09-26 03:11:10.848 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node0 8.596s 2025-09-26 03:11:10.855 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node0 8.708s 2025-09-26 03:11:10.967 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "IocxRg==", "port": 30124 }, { "ipAddressV4": "CoAAXA==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "aMaPYA==", "port": 30125 }, { "ipAddressV4": "CoAAXw==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "IjgW3A==", "port": 30126 }, { "ipAddressV4": "CoAAXQ==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "Iogp+A==", "port": 30127 }, { "ipAddressV4": "CoAAXg==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "Ih0pyA==", "port": 30128 }, { "ipAddressV4": "CoAAYA==", "port": 30128 }] }] }
node0 8.735s 2025-09-26 03:11:10.994 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/0/ConsistencyTestLog.csv
node0 8.736s 2025-09-26 03:11:10.995 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node0 8.756s 2025-09-26 03:11:11.015 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: 97d91d16d726634dd9c7dfd82761a869a1544e0a81676a973a13c9f60152666e7e053779f3cc036ccfdbe172c6772ff4
(root) ConsistencyTestingToolState / agent-people-brave-have
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate
    1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
    2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
node0 9.038s 2025-09-26 03:11:11.297 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node0 9.044s 2025-09-26 03:11:11.303 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node0 9.051s 2025-09-26 03:11:11.310 47 INFO STARTUP <<start-node-0>> ConsistencyTestingToolMain: init called in Main for node 0.
node0 9.052s 2025-09-26 03:11:11.311 48 INFO STARTUP <<start-node-0>> SwirldsPlatform: Starting platform 0
node0 9.053s 2025-09-26 03:11:11.312 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node0 9.058s 2025-09-26 03:11:11.317 50 INFO STARTUP <<start-node-0>> CycleFinder: No cyclical back pressure detected in wiring model.
node0 9.060s 2025-09-26 03:11:11.319 51 INFO STARTUP <<start-node-0>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node0 9.061s 2025-09-26 03:11:11.320 52 INFO STARTUP <<start-node-0>> InputWireChecks: All input wires have been bound.
node0 9.063s 2025-09-26 03:11:11.322 53 WARN STARTUP <<start-node-0>> PcesFileTracker: No preconsensus event files available
node0 9.063s 2025-09-26 03:11:11.322 54 INFO STARTUP <<start-node-0>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node0 9.065s 2025-09-26 03:11:11.324 55 INFO STARTUP <<start-node-0>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node0 9.067s 2025-09-26 03:11:11.326 56 INFO STARTUP <<app: appMain 0>> ConsistencyTestingToolMain: run called in Main.
node0 9.070s 2025-09-26 03:11:11.329 57 INFO PLATFORM_STATUS <platformForkJoinThread-4> DefaultStatusStateMachine: Platform spent 239.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node0 9.078s 2025-09-26 03:11:11.337 58 INFO PLATFORM_STATUS <platformForkJoinThread-5> DefaultStatusStateMachine: Platform spent 6.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node1 9.161s 2025-09-26 03:11:11.420 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting1.csv' ]
node1 9.163s 2025-09-26 03:11:11.422 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node2 9.208s 2025-09-26 03:11:11.467 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting2.csv' ]
node2 9.210s 2025-09-26 03:11:11.469 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node4 10.280s 2025-09-26 03:11:12.539 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting4.csv' ]
node4 10.282s 2025-09-26 03:11:12.541 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node3 11.229s 2025-09-26 03:11:13.488 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting3.csv' ]
node3 11.232s 2025-09-26 03:11:13.491 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node0 12.066s 2025-09-26 03:11:14.325 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting0.csv' ]
node0 12.069s 2025-09-26 03:11:14.328 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node1 16.259s 2025-09-26 03:11:18.518 61 INFO PLATFORM_STATUS <platformForkJoinThread-6> DefaultStatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node2 16.303s 2025-09-26 03:11:18.562 61 INFO PLATFORM_STATUS <platformForkJoinThread-7> DefaultStatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node4 17.377s 2025-09-26 03:11:19.636 61 INFO PLATFORM_STATUS <platformForkJoinThread-4> DefaultStatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node3 18.324s 2025-09-26 03:11:20.583 61 INFO PLATFORM_STATUS <platformForkJoinThread-5> DefaultStatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node0 19.162s 2025-09-26 03:11:21.421 61 INFO PLATFORM_STATUS <platformForkJoinThread-1> DefaultStatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node1 19.513s 2025-09-26 03:11:21.772 62 INFO STARTUP <<scheduler TransactionHandler>> DefaultTransactionHandler: Ignoring empty consensus round 1
node0 19.651s 2025-09-26 03:11:21.910 62 INFO STARTUP <<scheduler TransactionHandler>> DefaultTransactionHandler: Ignoring empty consensus round 1
node2 19.711s 2025-09-26 03:11:21.970 62 INFO STARTUP <<scheduler TransactionHandler>> DefaultTransactionHandler: Ignoring empty consensus round 1
node4 19.716s 2025-09-26 03:11:21.975 62 INFO STARTUP <<scheduler TransactionHandler>> DefaultTransactionHandler: Ignoring empty consensus round 1
node3 19.802s 2025-09-26 03:11:22.061 62 INFO STARTUP <<scheduler TransactionHandler>> DefaultTransactionHandler: Ignoring empty consensus round 1
node1 20.721s 2025-09-26 03:11:22.980 63 INFO PLATFORM_STATUS <platformForkJoinThread-5> DefaultStatusStateMachine: Platform spent 4.5 s in CHECKING. Now in ACTIVE
node1 20.724s 2025-09-26 03:11:22.983 65 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 2 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node2 20.776s 2025-09-26 03:11:23.035 63 INFO PLATFORM_STATUS <platformForkJoinThread-5> DefaultStatusStateMachine: Platform spent 4.5 s in CHECKING. Now in ACTIVE
node2 20.779s 2025-09-26 03:11:23.038 65 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 2 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node3 20.846s 2025-09-26 03:11:23.105 63 INFO PLATFORM_STATUS <platformForkJoinThread-8> DefaultStatusStateMachine: Platform spent 2.5 s in CHECKING. Now in ACTIVE
node3 20.849s 2025-09-26 03:11:23.108 65 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 2 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node0 20.864s 2025-09-26 03:11:23.123 64 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 2 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node0 20.867s 2025-09-26 03:11:23.126 81 INFO PLATFORM_STATUS <platformForkJoinThread-7> DefaultStatusStateMachine: Platform spent 1.7 s in CHECKING. Now in ACTIVE
node4 20.970s 2025-09-26 03:11:23.229 63 INFO PLATFORM_STATUS <platformForkJoinThread-4> DefaultStatusStateMachine: Platform spent 3.6 s in CHECKING. Now in ACTIVE
node4 20.974s 2025-09-26 03:11:23.233 65 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 2 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node2 21.119s 2025-09-26 03:11:23.378 84 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 2 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/2
node2 21.122s 2025-09-26 03:11:23.381 85 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node4 21.127s 2025-09-26 03:11:23.386 84 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 2 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/2
node4 21.130s 2025-09-26 03:11:23.389 85 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node1 21.190s 2025-09-26 03:11:23.449 84 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 2 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/2
node1 21.192s 2025-09-26 03:11:23.451 85 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node0 21.207s 2025-09-26 03:11:23.466 84 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 2 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/2
node0 21.209s 2025-09-26 03:11:23.468 85 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node3 21.262s 2025-09-26 03:11:23.521 84 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 2 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/2
node3 21.265s 2025-09-26 03:11:23.524 85 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node2 21.380s 2025-09-26 03:11:23.639 120 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node2 21.383s 2025-09-26 03:11:23.642 121 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 2
Timestamp: 2025-09-26T03:11:21.175408Z
Next consensus number: 12
Legacy running event hash: 6dc1ae23cc28296c809805b9a54904b5941eaf646adaaa40541547d7319f5d7a4d4b83d45793813739e27e0802e4ffa6
Legacy running event mnemonic: return-desert-kitchen-collect
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -799502542
Root hash: 1af439f8abd041df41d9d8467835682f5a2ec9d79c9718589e55fd7d5826021f96c0958f49d025470bc92a030471222a
(root) ConsistencyTestingToolState / future-disease-maid-desk
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 will-annual-input-round
    1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
    2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
    3 StringLeaf 8898299034380133366 /3 nephew-deny-real-blanket
    4 StringLeaf 1 /4 wreck-whale-old-bottom
node4 21.384s 2025-09-26 03:11:23.643 120 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node4 21.388s 2025-09-26 03:11:23.647 121 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 2
Timestamp: 2025-09-26T03:11:21.175408Z
Next consensus number: 12
Legacy running event hash: 6dc1ae23cc28296c809805b9a54904b5941eaf646adaaa40541547d7319f5d7a4d4b83d45793813739e27e0802e4ffa6
Legacy running event mnemonic: return-desert-kitchen-collect
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -799502542
Root hash: 1af439f8abd041df41d9d8467835682f5a2ec9d79c9718589e55fd7d5826021f96c0958f49d025470bc92a030471222a
(root) ConsistencyTestingToolState / future-disease-maid-desk
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 will-annual-input-round
    1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
    2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
    3 StringLeaf 8898299034380133366 /3 nephew-deny-real-blanket
    4 StringLeaf 1 /4 wreck-whale-old-bottom
node2 21.416s 2025-09-26 03:11:23.675 122 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+11+18.596726997Z_seq0_minr1_maxr501_orgn0.pces
node2 21.416s 2025-09-26 03:11:23.675 123 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1
File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+11+18.596726997Z_seq0_minr1_maxr501_orgn0.pces
node2 21.416s 2025-09-26 03:11:23.675 124 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 21.417s 2025-09-26 03:11:23.676 125 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 21.423s 2025-09-26 03:11:23.682 126 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 2 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/2 {"round":2,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/2/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 21.425s 2025-09-26 03:11:23.684 122 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T03+11+18.869647062Z_seq0_minr1_maxr501_orgn0.pces
node4 21.426s 2025-09-26 03:11:23.685 123 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1
File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T03+11+18.869647062Z_seq0_minr1_maxr501_orgn0.pces
node4 21.426s 2025-09-26 03:11:23.685 124 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 21.427s 2025-09-26 03:11:23.686 125 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 21.433s 2025-09-26 03:11:23.692 126 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 2 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/2 {"round":2,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/2/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 21.450s 2025-09-26 03:11:23.709 124 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node1 21.453s 2025-09-26 03:11:23.712 125 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 2
Timestamp: 2025-09-26T03:11:21.175408Z
Next consensus number: 12
Legacy running event hash: 6dc1ae23cc28296c809805b9a54904b5941eaf646adaaa40541547d7319f5d7a4d4b83d45793813739e27e0802e4ffa6
Legacy running event mnemonic: return-desert-kitchen-collect
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -799502542
Root hash: 1af439f8abd041df41d9d8467835682f5a2ec9d79c9718589e55fd7d5826021f96c0958f49d025470bc92a030471222a
(root) ConsistencyTestingToolState / future-disease-maid-desk
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 will-annual-input-round
    1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
    2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
    3 StringLeaf 8898299034380133366 /3 nephew-deny-real-blanket
    4 StringLeaf 1 /4 wreck-whale-old-bottom
node1 21.484s 2025-09-26 03:11:23.743 126 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+11+18.558338059Z_seq0_minr1_maxr501_orgn0.pces
node1 21.484s 2025-09-26 03:11:23.743 127 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1
File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+11+18.558338059Z_seq0_minr1_maxr501_orgn0.pces
node1 21.485s 2025-09-26 03:11:23.744 128 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 21.486s 2025-09-26 03:11:23.745 129 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 21.491s 2025-09-26 03:11:23.750 130 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 2 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/2 {"round":2,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/2/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 21.524s 2025-09-26 03:11:23.783 124 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node0 21.528s 2025-09-26 03:11:23.787 125 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 2
Timestamp: 2025-09-26T03:11:21.175408Z
Next consensus number: 12
Legacy running event hash: 6dc1ae23cc28296c809805b9a54904b5941eaf646adaaa40541547d7319f5d7a4d4b83d45793813739e27e0802e4ffa6
Legacy running event mnemonic: return-desert-kitchen-collect
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -799502542
Root hash: 1af439f8abd041df41d9d8467835682f5a2ec9d79c9718589e55fd7d5826021f96c0958f49d025470bc92a030471222a
(root) ConsistencyTestingToolState / future-disease-maid-desk
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 will-annual-input-round
    1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
    2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
    3 StringLeaf 8898299034380133366 /3 nephew-deny-real-blanket
    4 StringLeaf 1 /4 wreck-whale-old-bottom
node3 21.543s 2025-09-26 03:11:23.802 124 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node3 21.546s 2025-09-26 03:11:23.805 125 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 2
Timestamp: 2025-09-26T03:11:21.175408Z
Next consensus number: 12
Legacy running event hash: 6dc1ae23cc28296c809805b9a54904b5941eaf646adaaa40541547d7319f5d7a4d4b83d45793813739e27e0802e4ffa6
Legacy running event mnemonic: return-desert-kitchen-collect
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -799502542
Root hash: 1af439f8abd041df41d9d8467835682f5a2ec9d79c9718589e55fd7d5826021f96c0958f49d025470bc92a030471222a
(root) ConsistencyTestingToolState / future-disease-maid-desk
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 will-annual-input-round
    1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
    2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
    3 StringLeaf 8898299034380133366 /3 nephew-deny-real-blanket
    4 StringLeaf 1 /4 wreck-whale-old-bottom
node0 21.570s 2025-09-26 03:11:23.829 126 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+11+18.938316448Z_seq0_minr1_maxr501_orgn0.pces
node0 21.570s 2025-09-26 03:11:23.829 127 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1
File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+11+18.938316448Z_seq0_minr1_maxr501_orgn0.pces
node0 21.571s 2025-09-26 03:11:23.830 128 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 21.573s 2025-09-26 03:11:23.832 129 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 21.580s 2025-09-26 03:11:23.839 130 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 2 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/2 {"round":2,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/2/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 21.581s 2025-09-26 03:11:23.840 126 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+11+19.110292014Z_seq0_minr1_maxr501_orgn0.pces
node3 21.582s 2025-09-26 03:11:23.841 127 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1
File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+11+19.110292014Z_seq0_minr1_maxr501_orgn0.pces
node3 21.582s 2025-09-26 03:11:23.841 128 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 21.583s 2025-09-26 03:11:23.842 129 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 21.589s 2025-09-26 03:11:23.848 130 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 2 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/2 {"round":2,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/2/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 59.502s 2025-09-26 03:12:01.761 762 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 61 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 59.571s 2025-09-26 03:12:01.830 754 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 61 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 59.600s 2025-09-26 03:12:01.859 758 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 61 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 59.639s 2025-09-26 03:12:01.898 762 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 61 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 59.715s 2025-09-26 03:12:01.974 750 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 61 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 1.000m 2025-09-26 03:12:02.271 761 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 61 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/61
node1 1.000m 2025-09-26 03:12:02.272 762 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 61
node2 1.001m 2025-09-26 03:12:02.314 765 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 61 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/61
node2 1.001m 2025-09-26 03:12:02.314 766 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 61
node3 1.001m 2025-09-26 03:12:02.314 778 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 61 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/61
node3 1.001m 2025-09-26 03:12:02.314 779 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 61
node4 1.001m 2025-09-26 03:12:02.341 753 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 61 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/61
node4 1.001m 2025-09-26 03:12:02.342 754 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 61
node1 1.002m 2025-09-26 03:12:02.349 801 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 61
node1 1.002m 2025-09-26 03:12:02.352 802 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 61
Timestamp: 2025-09-26T03:12:00.457285Z
Next consensus number: 1634
Legacy running event hash: e1645b7d387e62157114062d710b8a86ecef6c2b04db1892de93a75adb7e1755a70e7bc0cd98551cb27b2e489796fe47
Legacy running event mnemonic: squeeze-hurry-average-dress
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -840892473
Root hash: 8b5549f8aa18ddc643410c135d2f468b2f4dc354488577be2378d74b9d4113b4a40bffdf877630cc91d337c3711cb4e0
(root) ConsistencyTestingToolState / nice-disease-rug-elder
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 time-reunion-ostrich-any
    1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
    2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
    3 StringLeaf -6019064657660791423 /3 nuclear-alley-wet-social
    4 StringLeaf 60 /4 tomato-add-viable-into
node1 1.002m 2025-09-26 03:12:02.360 803 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+11+18.558338059Z_seq0_minr1_maxr501_orgn0.pces
node1 1.002m 2025-09-26 03:12:02.360 804 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 34
File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+11+18.558338059Z_seq0_minr1_maxr501_orgn0.pces
node1 1.002m 2025-09-26 03:12:02.360 805 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 1.002m 2025-09-26 03:12:02.361 806 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 1.002m 2025-09-26 03:12:02.362 807 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 61 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/61 {"round":61,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/61/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 1.002m 2025-09-26 03:12:02.385 757 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 61 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/61
node0 1.002m 2025-09-26 03:12:02.386 759 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 61
node2 1.002m 2025-09-26 03:12:02.394 805 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 61
node2 1.002m 2025-09-26 03:12:02.397 806 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 61
Timestamp: 2025-09-26T03:12:00.457285Z
Next consensus number: 1634
Legacy running event hash: e1645b7d387e62157114062d710b8a86ecef6c2b04db1892de93a75adb7e1755a70e7bc0cd98551cb27b2e489796fe47
Legacy running event mnemonic: squeeze-hurry-average-dress
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -840892473
Root hash: 8b5549f8aa18ddc643410c135d2f468b2f4dc354488577be2378d74b9d4113b4a40bffdf877630cc91d337c3711cb4e0
(root) ConsistencyTestingToolState / nice-disease-rug-elder
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 time-reunion-ostrich-any
    1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
    2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
    3 StringLeaf -6019064657660791423 /3 nuclear-alley-wet-social
    4 StringLeaf 60 /4 tomato-add-viable-into
node3 1.002m 2025-09-26 03:12:02.403 815 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 61
node2 1.002m 2025-09-26 03:12:02.405 807 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+11+18.596726997Z_seq0_minr1_maxr501_orgn0.pces
node2 1.002m 2025-09-26 03:12:02.405 808 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 34
File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+11+18.596726997Z_seq0_minr1_maxr501_orgn0.pces
node2 1.002m 2025-09-26 03:12:02.405 809 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 1.002m 2025-09-26 03:12:02.406 816 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 61
Timestamp: 2025-09-26T03:12:00.457285Z
Next consensus number: 1634
Legacy running event hash: e1645b7d387e62157114062d710b8a86ecef6c2b04db1892de93a75adb7e1755a70e7bc0cd98551cb27b2e489796fe47
Legacy running event mnemonic: squeeze-hurry-average-dress
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -840892473
Root hash: 8b5549f8aa18ddc643410c135d2f468b2f4dc354488577be2378d74b9d4113b4a40bffdf877630cc91d337c3711cb4e0
(root) ConsistencyTestingToolState / nice-disease-rug-elder
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 time-reunion-ostrich-any
    1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
    2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
    3 StringLeaf -6019064657660791423 /3 nuclear-alley-wet-social
    4 StringLeaf 60 /4 tomato-add-viable-into
node2 1.002m 2025-09-26 03:12:02.407 810 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 1.002m 2025-09-26 03:12:02.407 811 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 61 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/61 {"round":61,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/61/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 1.003m 2025-09-26 03:12:02.415 817 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+11+19.110292014Z_seq0_minr1_maxr501_orgn0.pces
node3 1.003m 2025-09-26 03:12:02.415 818 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 34
File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+11+19.110292014Z_seq0_minr1_maxr501_orgn0.pces
node3 1.003m 2025-09-26 03:12:02.415 819 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 1.003m 2025-09-26 03:12:02.417 820 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 1.003m 2025-09-26 03:12:02.418 821 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 61 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/61 {"round":61,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/61/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 1.003m 2025-09-26 03:12:02.432 785 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 61
node4 1.003m 2025-09-26 03:12:02.436 786 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 61
Timestamp: 2025-09-26T03:12:00.457285Z
Next consensus number: 1634
Legacy running event hash: e1645b7d387e62157114062d710b8a86ecef6c2b04db1892de93a75adb7e1755a70e7bc0cd98551cb27b2e489796fe47
Legacy running event mnemonic: squeeze-hurry-average-dress
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -840892473
Root hash: 8b5549f8aa18ddc643410c135d2f468b2f4dc354488577be2378d74b9d4113b4a40bffdf877630cc91d337c3711cb4e0
(root) ConsistencyTestingToolState / nice-disease-rug-elder
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 time-reunion-ostrich-any
    1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
    2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
    3 StringLeaf -6019064657660791423 /3 nuclear-alley-wet-social
    4 StringLeaf 60 /4 tomato-add-viable-into
node4 1.003m 2025-09-26 03:12:02.445 787 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T03+11+18.869647062Z_seq0_minr1_maxr501_orgn0.pces
node4 1.003m 2025-09-26 03:12:02.445 788 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 34
File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T03+11+18.869647062Z_seq0_minr1_maxr501_orgn0.pces
node4 1.003m 2025-09-26 03:12:02.446 789 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 1.003m 2025-09-26 03:12:02.447 790 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 1.003m 2025-09-26 03:12:02.448 791 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 61 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/61 {"round":61,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/61/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 1.004m 2025-09-26 03:12:02.485 797 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 61
node0 1.004m 2025-09-26 03:12:02.489 798 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 61
Timestamp: 2025-09-26T03:12:00.457285Z
Next consensus number: 1634
Legacy running event hash: e1645b7d387e62157114062d710b8a86ecef6c2b04db1892de93a75adb7e1755a70e7bc0cd98551cb27b2e489796fe47
Legacy running event mnemonic: squeeze-hurry-average-dress
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -840892473
Root hash: 8b5549f8aa18ddc643410c135d2f468b2f4dc354488577be2378d74b9d4113b4a40bffdf877630cc91d337c3711cb4e0
(root) ConsistencyTestingToolState / nice-disease-rug-elder
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 time-reunion-ostrich-any
    1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
    2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
    3 StringLeaf -6019064657660791423 /3 nuclear-alley-wet-social
    4 StringLeaf 60 /4 tomato-add-viable-into
node0 1.004m 2025-09-26 03:12:02.500 799 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+11+18.938316448Z_seq0_minr1_maxr501_orgn0.pces
node0 1.004m 2025-09-26 03:12:02.500 800 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 34
File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+11+18.938316448Z_seq0_minr1_maxr501_orgn0.pces
node0 1.004m 2025-09-26 03:12:02.500 801 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 1.004m 2025-09-26 03:12:02.502 802 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 1.004m 2025-09-26 03:12:02.503 803 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 61 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/61 {"round":61,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/61/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 1m 59.340s 2025-09-26 03:13:01.599 1818 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 151 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 1m 59.414s 2025-09-26 03:13:01.673 1828 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 151 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 1m 59.471s 2025-09-26 03:13:01.730 1802 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 151 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 1m 59.559s 2025-09-26 03:13:01.818 1834 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 151 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 1m 59.559s 2025-09-26 03:13:01.818 1806 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 151 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 1m 59.718s 2025-09-26 03:13:01.977 1805 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 151 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/151
node0 1m 59.719s 2025-09-26 03:13:01.978 1806 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 151
node3 1m 59.822s 2025-09-26 03:13:02.081 1837 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 151 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/151
node3 1m 59.823s 2025-09-26 03:13:02.082 1846 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 151
node4 1m 59.824s 2025-09-26 03:13:02.083 1809 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 151 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/151
node4 1m 59.825s 2025-09-26 03:13:02.084 1810 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 151
node0 1m 59.829s 2025-09-26 03:13:02.088 1837 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 151
node0 1m 59.832s 2025-09-26 03:13:02.091 1838 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 151
Timestamp: 2025-09-26T03:13:00.552525088Z
Next consensus number: 4114
Legacy running event hash: 201fcf70e7dfa91a27cde41bbeb8db1aace98a0377ecd1601b986a8ee9dbc2ed0f513aa250018b2411acf37294978460
Legacy running event mnemonic: advance-bamboo-aerobic-system
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -19148890
Root hash: a2d4cb0ceff10e08ff25cdfe1578dcf2d02bf1c961f8f95cd82edb4b10a0141dd60f822f40eedf0aaab9e8eaddcc42ae
(root) ConsistencyTestingToolState / pitch-ozone-acid-scissors
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 raccoon-opinion-setup-room
    1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
    2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
    3 StringLeaf 1024460418094140552 /3 camp-hill-actor-image
    4 StringLeaf 150 /4 dish-fringe-rotate-already
node0 1m 59.844s 2025-09-26 03:13:02.103 1839 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+11+18.938316448Z_seq0_minr1_maxr501_orgn0.pces
node0 1m 59.844s 2025-09-26 03:13:02.103 1840 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 124
File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+11+18.938316448Z_seq0_minr1_maxr501_orgn0.pces
node0 1m 59.845s 2025-09-26 03:13:02.104 1841 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 1m 59.848s 2025-09-26 03:13:02.107 1842 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 1m 59.849s 2025-09-26 03:13:02.108 1843 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 151 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/151 {"round":151,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/151/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 1m 59.911s 2025-09-26 03:13:02.170 1831 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 151 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/151
node1 1m 59.911s 2025-09-26 03:13:02.170 1832 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 151
node4 1m 59.913s 2025-09-26 03:13:02.172 1845 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 151
node4 1m 59.915s 2025-09-26 03:13:02.174 1846 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 151
Timestamp: 2025-09-26T03:13:00.552525088Z
Next consensus number: 4114
Legacy running event hash: 201fcf70e7dfa91a27cde41bbeb8db1aace98a0377ecd1601b986a8ee9dbc2ed0f513aa250018b2411acf37294978460
Legacy running event mnemonic: advance-bamboo-aerobic-system
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -19148890
Root hash: a2d4cb0ceff10e08ff25cdfe1578dcf2d02bf1c961f8f95cd82edb4b10a0141dd60f822f40eedf0aaab9e8eaddcc42ae
(root) ConsistencyTestingToolState / pitch-ozone-acid-scissors
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 raccoon-opinion-setup-room
    1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
    2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
    3 StringLeaf 1024460418094140552 /3 camp-hill-actor-image
    4 StringLeaf 150 /4 dish-fringe-rotate-already
node3 1m 59.916s 2025-09-26 03:13:02.175 1875 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 151
node3 1m 59.919s 2025-09-26 03:13:02.178 1876 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 151
Timestamp: 2025-09-26T03:13:00.552525088Z
Next consensus number: 4114
Legacy running event hash: 201fcf70e7dfa91a27cde41bbeb8db1aace98a0377ecd1601b986a8ee9dbc2ed0f513aa250018b2411acf37294978460
Legacy running event mnemonic: advance-bamboo-aerobic-system
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -19148890
Root hash: a2d4cb0ceff10e08ff25cdfe1578dcf2d02bf1c961f8f95cd82edb4b10a0141dd60f822f40eedf0aaab9e8eaddcc42ae
(root) ConsistencyTestingToolState / pitch-ozone-acid-scissors
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 raccoon-opinion-setup-room
    1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
    2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
    3 StringLeaf 1024460418094140552 /3 camp-hill-actor-image
    4 StringLeaf 150 /4 dish-fringe-rotate-already
node4 1m 59.924s 2025-09-26 03:13:02.183 1847 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T03+11+18.869647062Z_seq0_minr1_maxr501_orgn0.pces
node4 1m 59.924s 2025-09-26 03:13:02.183 1848 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 124
File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T03+11+18.869647062Z_seq0_minr1_maxr501_orgn0.pces
node4 1m 59.924s 2025-09-26 03:13:02.183 1849 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 1m 59.927s 2025-09-26 03:13:02.186 1850 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 1m 59.927s 2025-09-26 03:13:02.186 1851 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 151 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/151 {"round":151,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/151/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 1m 59.928s 2025-09-26 03:13:02.187 1877 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+11+19.110292014Z_seq0_minr1_maxr501_orgn0.pces
node3 1m 59.928s 2025-09-26 03:13:02.187 1878 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 124
File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+11+19.110292014Z_seq0_minr1_maxr501_orgn0.pces
node3 1m 59.929s 2025-09-26 03:13:02.188 1879 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 1m 59.933s 2025-09-26 03:13:02.192 1880 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 1m 59.934s 2025-09-26 03:13:02.193 1881 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 151 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/151 {"round":151,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/151/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 1m 59.965s 2025-09-26 03:13:02.224 1831 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 151 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/151
node2 1m 59.965s 2025-09-26 03:13:02.224 1832 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 151
node1 2.000m 2025-09-26 03:13:02.259 1875 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 151
node1 2.000m 2025-09-26 03:13:02.260 1876 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 151
Timestamp: 2025-09-26T03:13:00.552525088Z
Next consensus number: 4114
Legacy running event hash: 201fcf70e7dfa91a27cde41bbeb8db1aace98a0377ecd1601b986a8ee9dbc2ed0f513aa250018b2411acf37294978460
Legacy running event mnemonic: advance-bamboo-aerobic-system
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -19148890
Root hash: a2d4cb0ceff10e08ff25cdfe1578dcf2d02bf1c961f8f95cd82edb4b10a0141dd60f822f40eedf0aaab9e8eaddcc42ae
(root) ConsistencyTestingToolState / pitch-ozone-acid-scissors
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 raccoon-opinion-setup-room
    1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
    2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
    3 StringLeaf 1024460418094140552 /3 camp-hill-actor-image
    4 StringLeaf 150 /4 dish-fringe-rotate-already
node1 2.000m 2025-09-26 03:13:02.267 1877 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+11+18.558338059Z_seq0_minr1_maxr501_orgn0.pces
node1 2.000m 2025-09-26 03:13:02.267 1878 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 124
File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+11+18.558338059Z_seq0_minr1_maxr501_orgn0.pces
node1 2.000m 2025-09-26 03:13:02.268 1879 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 2.000m 2025-09-26 03:13:02.271 1880 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 2.000m 2025-09-26 03:13:02.271 1881 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 151 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/151 {"round":151,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/151/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 2.001m 2025-09-26 03:13:02.312 1875 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 151
node2 2.001m 2025-09-26 03:13:02.315 1876 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 151
Timestamp: 2025-09-26T03:13:00.552525088Z
Next consensus number: 4114
Legacy running event hash: 201fcf70e7dfa91a27cde41bbeb8db1aace98a0377ecd1601b986a8ee9dbc2ed0f513aa250018b2411acf37294978460
Legacy running event mnemonic: advance-bamboo-aerobic-system
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -19148890
Root hash: a2d4cb0ceff10e08ff25cdfe1578dcf2d02bf1c961f8f95cd82edb4b10a0141dd60f822f40eedf0aaab9e8eaddcc42ae
(root) ConsistencyTestingToolState / pitch-ozone-acid-scissors
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 raccoon-opinion-setup-room
    1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
    2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
    3 StringLeaf 1024460418094140552 /3 camp-hill-actor-image
    4 StringLeaf 150 /4 dish-fringe-rotate-already
node2 2.001m 2025-09-26 03:13:02.322 1877 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+11+18.596726997Z_seq0_minr1_maxr501_orgn0.pces
node2 2.001m 2025-09-26 03:13:02.322 1878 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 124
File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+11+18.596726997Z_seq0_minr1_maxr501_orgn0.pces
node2 2.001m 2025-09-26 03:13:02.322 1879 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 2.001m 2025-09-26 03:13:02.325 1880 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 2.001m 2025-09-26 03:13:02.326 1881 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 151 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/151 {"round":151,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/151/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 2m 59.199s 2025-09-26 03:14:01.458 2899 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 244 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 2m 59.250s 2025-09-26 03:14:01.509 2931 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 244 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 2m 59.318s 2025-09-26 03:14:01.577 2885 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 244 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 2m 59.458s 2025-09-26 03:14:01.717 2887 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 244 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 2m 59.585s 2025-09-26 03:14:01.844 2925 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 244 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 2m 59.692s 2025-09-26 03:14:01.951 2912 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 244 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/244
node2 2m 59.692s 2025-09-26 03:14:01.951 2913 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 244
node4 2m 59.724s 2025-09-26 03:14:01.983 2890 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 244 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/244
node4 2m 59.725s 2025-09-26 03:14:01.984 2891 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 244
node1 2m 59.732s 2025-09-26 03:14:01.991 2928 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 244 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/244
node1 2m 59.733s 2025-09-26 03:14:01.992 2929 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 244
node0 2m 59.749s 2025-09-26 03:14:02.008 2888 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 244 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/244
node0 2m 59.750s 2025-09-26 03:14:02.009 2889 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 244
node2 2m 59.784s 2025-09-26 03:14:02.043 2944 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 244
node2 2m 59.786s 2025-09-26 03:14:02.045 2945 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 244
Timestamp: 2025-09-26T03:14:00.252134Z
Next consensus number: 6632
Legacy running event hash: 44af34a3649465d7156413857917fb34ef26eca3130d3a1504478dcd0f586b133af0388ccc3ba35fab23430d3ac732e2
Legacy running event mnemonic: nominee-nerve-myth-road
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1804150919
Root hash: a0aa9423aef6f2d6eb3120b7ebf7c335fe3348d84fe5caf0aa4d424e39e382a935cf2f2fa36d52554a9fdf9608798f83
(root) ConsistencyTestingToolState / turn-knife-candy-pipe
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 donate-defense-faith-acoustic
    1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
    2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
    3 StringLeaf 9099383761427963870 /3 angle-shock-crush-duck
    4 StringLeaf 243 /4 whip-ignore-path-auction
node2 2m 59.796s 2025-09-26 03:14:02.055 2946 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+11+18.596726997Z_seq0_minr1_maxr501_orgn0.pces
node2 2m 59.796s 2025-09-26 03:14:02.055 2947 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 217
File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+11+18.596726997Z_seq0_minr1_maxr501_orgn0.pces
node2 2m 59.796s 2025-09-26 03:14:02.055 2948 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 2m 59.801s 2025-09-26 03:14:02.060 2949 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 2m 59.802s 2025-09-26 03:14:02.061 2950 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 244 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/244 {"round":244,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/244/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 2m 59.804s 2025-09-26 03:14:02.063 2944 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 244 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/244
node3 2m 59.805s 2025-09-26 03:14:02.064 2945 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 244
node4 2m 59.812s 2025-09-26 03:14:02.071 2922 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 244
node4 2m 59.814s 2025-09-26 03:14:02.073 2923 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 244
Timestamp: 2025-09-26T03:14:00.252134Z
Next consensus number: 6632
Legacy running event hash: 44af34a3649465d7156413857917fb34ef26eca3130d3a1504478dcd0f586b133af0388ccc3ba35fab23430d3ac732e2
Legacy running event mnemonic: nominee-nerve-myth-road
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1804150919
Root hash: a0aa9423aef6f2d6eb3120b7ebf7c335fe3348d84fe5caf0aa4d424e39e382a935cf2f2fa36d52554a9fdf9608798f83
(root) ConsistencyTestingToolState / turn-knife-candy-pipe
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 donate-defense-faith-acoustic
    1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
    2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
    3 StringLeaf 9099383761427963870 /3 angle-shock-crush-duck
    4 StringLeaf 243 /4 whip-ignore-path-auction
node1 2m 59.822s 2025-09-26 03:14:02.081 2972 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 244
node4 2m 59.822s 2025-09-26 03:14:02.081 2924 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T03+11+18.869647062Z_seq0_minr1_maxr501_orgn0.pces
node4 2m 59.822s 2025-09-26 03:14:02.081 2925 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 217
File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T03+11+18.869647062Z_seq0_minr1_maxr501_orgn0.pces
node4 2m 59.823s 2025-09-26 03:14:02.082 2926 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 2m 59.824s 2025-09-26 03:14:02.083 2973 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 244
Timestamp: 2025-09-26T03:14:00.252134Z
Next consensus number: 6632
Legacy running event hash: 44af34a3649465d7156413857917fb34ef26eca3130d3a1504478dcd0f586b133af0388ccc3ba35fab23430d3ac732e2
Legacy running event mnemonic: nominee-nerve-myth-road
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1804150919
Root hash: a0aa9423aef6f2d6eb3120b7ebf7c335fe3348d84fe5caf0aa4d424e39e382a935cf2f2fa36d52554a9fdf9608798f83
(root) ConsistencyTestingToolState / turn-knife-candy-pipe
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 donate-defense-faith-acoustic
    1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
    2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
    3 StringLeaf 9099383761427963870 /3 angle-shock-crush-duck
    4 StringLeaf 243 /4 whip-ignore-path-auction
node4 2m 59.828s 2025-09-26 03:14:02.087 2927 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 2m 59.828s 2025-09-26 03:14:02.087 2928 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 244 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/244 {"round":244,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/244/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 2m 59.830s 2025-09-26 03:14:02.089 2974 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+11+18.558338059Z_seq0_minr1_maxr501_orgn0.pces
node1 2m 59.831s 2025-09-26 03:14:02.090 2975 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 217
File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+11+18.558338059Z_seq0_minr1_maxr501_orgn0.pces
node1 2m 59.831s 2025-09-26 03:14:02.090 2976 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 2m 59.835s 2025-09-26 03:14:02.094 2977 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 2m 59.836s 2025-09-26 03:14:02.095 2978 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 244 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/244 {"round":244,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/244/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 2m 59.856s 2025-09-26 03:14:02.115 2920 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 244
node0 2m 59.859s 2025-09-26 03:14:02.118 2921 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 244
Timestamp: 2025-09-26T03:14:00.252134Z
Next consensus number: 6632
Legacy running event hash: 44af34a3649465d7156413857917fb34ef26eca3130d3a1504478dcd0f586b133af0388ccc3ba35fab23430d3ac732e2
Legacy running event mnemonic: nominee-nerve-myth-road
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1804150919
Root hash: a0aa9423aef6f2d6eb3120b7ebf7c335fe3348d84fe5caf0aa4d424e39e382a935cf2f2fa36d52554a9fdf9608798f83
(root) ConsistencyTestingToolState / turn-knife-candy-pipe
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 donate-defense-faith-acoustic
    1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
    2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
    3 StringLeaf 9099383761427963870 /3 angle-shock-crush-duck
    4 StringLeaf 243 /4 whip-ignore-path-auction
node0 2m 59.869s 2025-09-26 03:14:02.128 2930 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+11+18.938316448Z_seq0_minr1_maxr501_orgn0.pces
node0 2m 59.869s 2025-09-26 03:14:02.128 2931 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 217
File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+11+18.938316448Z_seq0_minr1_maxr501_orgn0.pces
node0 2m 59.870s 2025-09-26 03:14:02.129 2932 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 2m 59.875s 2025-09-26 03:14:02.134 2933 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 2m 59.876s 2025-09-26 03:14:02.135 2934 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 244 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/244 {"round":244,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/244/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 2m 59.896s 2025-09-26 03:14:02.155 2988 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 244
node3 2m 59.898s 2025-09-26 03:14:02.157 2989 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 244
Timestamp: 2025-09-26T03:14:00.252134Z
Next consensus number: 6632
Legacy running event hash: 44af34a3649465d7156413857917fb34ef26eca3130d3a1504478dcd0f586b133af0388ccc3ba35fab23430d3ac732e2
Legacy running event mnemonic: nominee-nerve-myth-road
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1804150919
Root hash: a0aa9423aef6f2d6eb3120b7ebf7c335fe3348d84fe5caf0aa4d424e39e382a935cf2f2fa36d52554a9fdf9608798f83
(root) ConsistencyTestingToolState / turn-knife-candy-pipe
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 donate-defense-faith-acoustic
    1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
    2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
    3 StringLeaf 9099383761427963870 /3 angle-shock-crush-duck
    4 StringLeaf 243 /4 whip-ignore-path-auction
node3 2m 59.905s 2025-09-26 03:14:02.164 2990 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+11+19.110292014Z_seq0_minr1_maxr501_orgn0.pces
node3 2m 59.905s 2025-09-26 03:14:02.164 2991 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 217
File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+11+19.110292014Z_seq0_minr1_maxr501_orgn0.pces
node3 2m 59.905s 2025-09-26 03:14:02.164 2992 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 2m 59.910s 2025-09-26 03:14:02.169 2993 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 2m 59.911s 2025-09-26 03:14:02.170 2994 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 244 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/244 {"round":244,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/244/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 3m 12.079s 2025-09-26 03:14:14.338 3187 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 1 to 4>> NetworkUtils: Connection broken: 1 -> 4
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-26T03:14:14.336380551Z
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
	at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
	at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
	at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
	at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
	at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-26T03:14:14.336380551Z
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:148)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
	... 7 more
Caused by: java.net.SocketException: Connection reset
	at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
	at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
	at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
	at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
	at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
	at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
	at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
	at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
	at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
	at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
	at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
	at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
	at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
	at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
	at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
	at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
	at java.base/java.io.DataInputStream.readLong(DataInputStream.java:407)
	at org.hiero.base.io.streams.AugmentedDataInputStream.readLong(AugmentedDataInputStream.java:186)
	at com.swirlds.platform.gossip.shadowgraph.SyncUtils.deserializeEventWindow(SyncUtils.java:640)
	at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readTheirTipsAndEventWindow$3(SyncUtils.java:104)
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
	... 11 more
node3 3m 12.079s 2025-09-26 03:14:14.338 3207 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 3 to 4>> NetworkUtils: Connection broken: 3 -> 4
java.net.SocketException: Connection reset
	at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
	at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
	at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
	at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
	at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
	at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
	at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
	at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
	at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
	at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
	at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
	at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
	at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
	at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
	at java.base/java.io.FilterInputStream.read(FilterInputStream.java:71)
	at org.hiero.base.io.streams.AugmentedDataInputStream.read(AugmentedDataInputStream.java:57)
	at com.swirlds.platform.network.communication.states.WaitForAcceptReject.transition(WaitForAcceptReject.java:48)
	at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
	at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
	at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
	at java.base/java.lang.Thread.run(Thread.java:1583)
node2 3m 12.080s 2025-09-26 03:14:14.339 3181 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 2 to 4>> NetworkUtils: Connection broken: 2 -> 4
java.net.SocketException: Connection reset
	at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
	at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
	at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
	at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
	at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
	at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
	at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
	at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
	at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
	at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
	at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
	at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
	at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
	at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
	at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
	at java.base/java.io.DataInputStream.readByte(DataInputStream.java:275)
	at org.hiero.base.io.streams.AugmentedDataInputStream.readByte(AugmentedDataInputStream.java:144)
	at com.swirlds.platform.heartbeats.HeartbeatPeerProtocol.acknowledgeHeartbeat(HeartbeatPeerProtocol.java:136)
	at com.swirlds.platform.heartbeats.HeartbeatPeerProtocol.runProtocol(HeartbeatPeerProtocol.java:157)
	at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
	at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
	at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
	at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
	at java.base/java.lang.Thread.run(Thread.java:1583)
node0 3m 12.082s 2025-09-26 03:14:14.341 3160 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 0 to 4>> NetworkUtils: Connection broken: 0 -> 4
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-26T03:14:14.338516227Z
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
	at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
	at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
	at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
	at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
	at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-26T03:14:14.338516227Z
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.sendAndReceiveEvents(ShadowgraphSynchronizer.java:241)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:201)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
	... 7 more
Caused by: java.net.SocketException: Connection reset
	at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
	at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
	at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
	at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
	at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
	at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
	at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
	at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
	at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
	at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
	at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
	at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
	at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
	at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
	at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
	at java.base/java.io.DataInputStream.readByte(DataInputStream.java:275)
	at org.hiero.base.io.streams.AugmentedDataInputStream.readByte(AugmentedDataInputStream.java:144)
	at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readEventsINeed$9(SyncUtils.java:278)
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
	... 12 more
node1 3m 59.504s 2025-09-26 03:15:01.763 3999 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 333 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 3m 59.510s 2025-09-26 03:15:01.769 3973 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 333 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 3m 59.569s 2025-09-26 03:15:01.828 3989 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 333 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 3m 59.609s 2025-09-26 03:15:01.868 3929 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 333 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 3m 59.729s 2025-09-26 03:15:01.988 3932 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 333 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/333
node0 3m 59.729s 2025-09-26 03:15:01.988 3933 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 333
node0 3m 59.831s 2025-09-26 03:15:02.090 3964 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 333
node0 3m 59.834s 2025-09-26 03:15:02.093 3965 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 333
Timestamp: 2025-09-26T03:15:00.421279776Z
Next consensus number: 8403
Legacy running event hash: 9aa2447b7b782c5fd05f38220af331668b9f6d263b5e8b05b9936ad0ed61db7e1bf93e2dc60821957fe250344e542b08
Legacy running event mnemonic: profit-month-humor-message
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1751431340
Root hash: abcebc0d7a234b3dc9ea52dac199f2081ee7d0d73e34a05906d7d06abe0af029648af298b2a29cd6454c4d6038c7d32f
(root) ConsistencyTestingToolState  /  main-lazy-dutch-guilt
  0 SingletonNode PlatformStateService.PLATFORM_STATE  /0  yellow-priority-senior-tunnel
  1 SingletonNode RosterService.ROSTER_STATE  /1  library-agent-antique-estate
  2 VirtualMap RosterService.ROSTERS  /2  box-toast-control-amazing
  3 StringLeaf 1511770382246684861  /3  maid-evoke-poem-fragile
  4 StringLeaf 332  /4  polar-pact-teach-path
node0 3m 59.842s 2025-09-26 03:15:02.101 3966 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+11+18.938316448Z_seq0_minr1_maxr501_orgn0.pces
node0 3m 59.842s 2025-09-26 03:15:02.101 3967 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 306 File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+11+18.938316448Z_seq0_minr1_maxr501_orgn0.pces
node0 3m 59.843s 2025-09-26 03:15:02.102 3968 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 3m 59.851s 2025-09-26 03:15:02.110 3969 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 3m 59.852s 2025-09-26 03:15:02.111 3970 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 333 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/333 {"round":333,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/333/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 3m 59.961s 2025-09-26 03:15:02.220 3986 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 333 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/333
node3 3m 59.962s 2025-09-26 03:15:02.221 3987 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 333
node2 3m 59.991s 2025-09-26 03:15:02.250 3992 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 333 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/333
node2 3m 59.992s 2025-09-26 03:15:02.251 3993 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 333
node3 4.001m 2025-09-26 03:15:02.315 4030 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 333
node3 4.001m 2025-09-26 03:15:02.317 4031 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 333
Timestamp: 2025-09-26T03:15:00.421279776Z
Next consensus number: 8403
Legacy running event hash: 9aa2447b7b782c5fd05f38220af331668b9f6d263b5e8b05b9936ad0ed61db7e1bf93e2dc60821957fe250344e542b08
Legacy running event mnemonic: profit-month-humor-message
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1751431340
Root hash: abcebc0d7a234b3dc9ea52dac199f2081ee7d0d73e34a05906d7d06abe0af029648af298b2a29cd6454c4d6038c7d32f
(root) ConsistencyTestingToolState  /  main-lazy-dutch-guilt
  0 SingletonNode PlatformStateService.PLATFORM_STATE  /0  yellow-priority-senior-tunnel
  1 SingletonNode RosterService.ROSTER_STATE  /1  library-agent-antique-estate
  2 VirtualMap RosterService.ROSTERS  /2  box-toast-control-amazing
  3 StringLeaf 1511770382246684861  /3  maid-evoke-poem-fragile
  4 StringLeaf 332  /4  polar-pact-teach-path
node1 4.001m 2025-09-26 03:15:02.318 4012 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 333 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/333
node1 4.001m 2025-09-26 03:15:02.318 4013 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 333
node3 4.001m 2025-09-26 03:15:02.323 4032 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+11+19.110292014Z_seq0_minr1_maxr501_orgn0.pces
node3 4.001m 2025-09-26 03:15:02.324 4033 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 306 File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+11+19.110292014Z_seq0_minr1_maxr501_orgn0.pces
node3 4.001m 2025-09-26 03:15:02.324 4034 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 4.001m 2025-09-26 03:15:02.330 4035 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 4.001m 2025-09-26 03:15:02.330 4036 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 333 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/333 {"round":333,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/333/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 4.001m 2025-09-26 03:15:02.337 4028 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 333
node2 4.001m 2025-09-26 03:15:02.339 4029 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 333
Timestamp: 2025-09-26T03:15:00.421279776Z
Next consensus number: 8403
Legacy running event hash: 9aa2447b7b782c5fd05f38220af331668b9f6d263b5e8b05b9936ad0ed61db7e1bf93e2dc60821957fe250344e542b08
Legacy running event mnemonic: profit-month-humor-message
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1751431340
Root hash: abcebc0d7a234b3dc9ea52dac199f2081ee7d0d73e34a05906d7d06abe0af029648af298b2a29cd6454c4d6038c7d32f
(root) ConsistencyTestingToolState  /  main-lazy-dutch-guilt
  0 SingletonNode PlatformStateService.PLATFORM_STATE  /0  yellow-priority-senior-tunnel
  1 SingletonNode RosterService.ROSTER_STATE  /1  library-agent-antique-estate
  2 VirtualMap RosterService.ROSTERS  /2  box-toast-control-amazing
  3 StringLeaf 1511770382246684861  /3  maid-evoke-poem-fragile
  4 StringLeaf 332  /4  polar-pact-teach-path
node2 4.001m 2025-09-26 03:15:02.347 4030 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+11+18.596726997Z_seq0_minr1_maxr501_orgn0.pces
node2 4.001m 2025-09-26 03:15:02.347 4031 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 306 File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+11+18.596726997Z_seq0_minr1_maxr501_orgn0.pces
node2 4.001m 2025-09-26 03:15:02.347 4032 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 4.002m 2025-09-26 03:15:02.353 4033 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 4.002m 2025-09-26 03:15:02.354 4034 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 333 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/333 {"round":333,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/333/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 4.002m 2025-09-26 03:15:02.398 4052 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 333
node1 4.002m 2025-09-26 03:15:02.400 4053 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 333
Timestamp: 2025-09-26T03:15:00.421279776Z
Next consensus number: 8403
Legacy running event hash: 9aa2447b7b782c5fd05f38220af331668b9f6d263b5e8b05b9936ad0ed61db7e1bf93e2dc60821957fe250344e542b08
Legacy running event mnemonic: profit-month-humor-message
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1751431340
Root hash: abcebc0d7a234b3dc9ea52dac199f2081ee7d0d73e34a05906d7d06abe0af029648af298b2a29cd6454c4d6038c7d32f
(root) ConsistencyTestingToolState  /  main-lazy-dutch-guilt
  0 SingletonNode PlatformStateService.PLATFORM_STATE  /0  yellow-priority-senior-tunnel
  1 SingletonNode RosterService.ROSTER_STATE  /1  library-agent-antique-estate
  2 VirtualMap RosterService.ROSTERS  /2  box-toast-control-amazing
  3 StringLeaf 1511770382246684861  /3  maid-evoke-poem-fragile
  4 StringLeaf 332  /4  polar-pact-teach-path
node1 4.002m 2025-09-26 03:15:02.407 4054 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+11+18.558338059Z_seq0_minr1_maxr501_orgn0.pces
node1 4.002m 2025-09-26 03:15:02.407 4055 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 306 File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+11+18.558338059Z_seq0_minr1_maxr501_orgn0.pces
node1 4.002m 2025-09-26 03:15:02.407 4056 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 4.003m 2025-09-26 03:15:02.413 4057 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 4.003m 2025-09-26 03:15:02.414 4058 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 333 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/333 {"round":333,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/333/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 4m 58.909s 2025-09-26 03:16:01.168 5116 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 428 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 4m 58.994s 2025-09-26 03:16:01.253 5126 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 428 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 4m 59.108s 2025-09-26 03:16:01.367 5140 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 428 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 4m 59.255s 2025-09-26 03:16:01.514 5068 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 428 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 4m 59.338s 2025-09-26 03:16:01.597 5071 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 428 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/428
node0 4m 59.339s 2025-09-26 03:16:01.598 5072 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 428
node1 4m 59.409s 2025-09-26 03:16:01.668 5139 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 428 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/428
node1 4m 59.410s 2025-09-26 03:16:01.669 5140 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 428
node0 4m 59.428s 2025-09-26 03:16:01.687 5103 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 428
node0 4m 59.431s 2025-09-26 03:16:01.690 5104 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 428
Timestamp: 2025-09-26T03:16:00.010510454Z
Next consensus number: 9928
Legacy running event hash: c7fbb21842656a3889cd14876c450089514a085dc746a255fbef109ad77baa33c0b07815de282d574c8cf2fa8e696859
Legacy running event mnemonic: absent-rug-erosion-either
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 462789221
Root hash: 786e1dec1e3bc426c2299889311f96744f795e748c63bc4f0e43e339788ff1d0fa78f3cefa950a15d219c26570a467c4
(root) ConsistencyTestingToolState  /  assist-rural-shuffle-kitten
  0 SingletonNode PlatformStateService.PLATFORM_STATE  /0  swap-inmate-frown-satisfy
  1 SingletonNode RosterService.ROSTER_STATE  /1  library-agent-antique-estate
  2 VirtualMap RosterService.ROSTERS  /2  box-toast-control-amazing
  3 StringLeaf 4101605859953644585  /3  maze-boat-sheriff-fish
  4 StringLeaf 427  /4  oblige-fork-uphold-segment
node0 4m 59.437s 2025-09-26 03:16:01.696 5105 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+11+18.938316448Z_seq0_minr1_maxr501_orgn0.pces
node0 4m 59.437s 2025-09-26 03:16:01.696 5106 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 401 File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+11+18.938316448Z_seq0_minr1_maxr501_orgn0.pces
node0 4m 59.438s 2025-09-26 03:16:01.697 5107 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 4m 59.445s 2025-09-26 03:16:01.704 5108 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 4m 59.446s 2025-09-26 03:16:01.705 5109 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 428 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/428 {"round":428,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/428/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 4m 59.447s 2025-09-26 03:16:01.706 5110 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/2
node1 4m 59.496s 2025-09-26 03:16:01.755 5171 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 428
node1 4m 59.498s 2025-09-26 03:16:01.757 5172 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 428
Timestamp: 2025-09-26T03:16:00.010510454Z
Next consensus number: 9928
Legacy running event hash: c7fbb21842656a3889cd14876c450089514a085dc746a255fbef109ad77baa33c0b07815de282d574c8cf2fa8e696859
Legacy running event mnemonic: absent-rug-erosion-either
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 462789221
Root hash: 786e1dec1e3bc426c2299889311f96744f795e748c63bc4f0e43e339788ff1d0fa78f3cefa950a15d219c26570a467c4
(root) ConsistencyTestingToolState  /  assist-rural-shuffle-kitten
  0 SingletonNode PlatformStateService.PLATFORM_STATE  /0  swap-inmate-frown-satisfy
  1 SingletonNode RosterService.ROSTER_STATE  /1  library-agent-antique-estate
  2 VirtualMap RosterService.ROSTERS  /2  box-toast-control-amazing
  3 StringLeaf 4101605859953644585  /3  maze-boat-sheriff-fish
  4 StringLeaf 427  /4  oblige-fork-uphold-segment
node1 4m 59.504s 2025-09-26 03:16:01.763 5173 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+11+18.558338059Z_seq0_minr1_maxr501_orgn0.pces
node1 4m 59.505s 2025-09-26 03:16:01.764 5174 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 401 File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+11+18.558338059Z_seq0_minr1_maxr501_orgn0.pces
node1 4m 59.505s 2025-09-26 03:16:01.764 5175 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 4m 59.506s 2025-09-26 03:16:01.765 5143 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 428 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/428
node3 4m 59.507s 2025-09-26 03:16:01.766 5144 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 428
node1 4m 59.512s 2025-09-26 03:16:01.771 5176 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 4m 59.512s 2025-09-26 03:16:01.771 5177 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 428 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/428 {"round":428,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/428/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 4m 59.514s 2025-09-26 03:16:01.773 5178 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/2
node3 4m 59.598s 2025-09-26 03:16:01.857 5175 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 428
node2 4m 59.601s 2025-09-26 03:16:01.860 5129 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 428 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/428
node3 4m 59.601s 2025-09-26 03:16:01.860 5176 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 428
Timestamp: 2025-09-26T03:16:00.010510454Z
Next consensus number: 9928
Legacy running event hash: c7fbb21842656a3889cd14876c450089514a085dc746a255fbef109ad77baa33c0b07815de282d574c8cf2fa8e696859
Legacy running event mnemonic: absent-rug-erosion-either
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 462789221
Root hash: 786e1dec1e3bc426c2299889311f96744f795e748c63bc4f0e43e339788ff1d0fa78f3cefa950a15d219c26570a467c4
(root) ConsistencyTestingToolState  /  assist-rural-shuffle-kitten
  0 SingletonNode PlatformStateService.PLATFORM_STATE  /0  swap-inmate-frown-satisfy
  1 SingletonNode RosterService.ROSTER_STATE  /1  library-agent-antique-estate
  2 VirtualMap RosterService.ROSTERS  /2  box-toast-control-amazing
  3 StringLeaf 4101605859953644585  /3  maze-boat-sheriff-fish
  4 StringLeaf 427  /4  oblige-fork-uphold-segment
node2 4m 59.602s 2025-09-26 03:16:01.861 5130 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 428
node3 4m 59.607s 2025-09-26 03:16:01.866 5177 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+11+19.110292014Z_seq0_minr1_maxr501_orgn0.pces
node3 4m 59.607s 2025-09-26 03:16:01.866 5178 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 401 File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+11+19.110292014Z_seq0_minr1_maxr501_orgn0.pces
node3 4m 59.608s 2025-09-26 03:16:01.867 5179 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 4m 59.615s 2025-09-26 03:16:01.874 5180 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 4m 59.616s 2025-09-26 03:16:01.875 5181 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 428 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/428 {"round":428,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/428/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 4m 59.617s 2025-09-26 03:16:01.876 5182 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/2
node2 4m 59.688s 2025-09-26 03:16:01.947 5175 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 428
node2 4m 59.690s 2025-09-26 03:16:01.949 5176 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 428
Timestamp: 2025-09-26T03:16:00.010510454Z
Next consensus number: 9928
Legacy running event hash: c7fbb21842656a3889cd14876c450089514a085dc746a255fbef109ad77baa33c0b07815de282d574c8cf2fa8e696859
Legacy running event mnemonic: absent-rug-erosion-either
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 462789221
Root hash: 786e1dec1e3bc426c2299889311f96744f795e748c63bc4f0e43e339788ff1d0fa78f3cefa950a15d219c26570a467c4
(root) ConsistencyTestingToolState  /  assist-rural-shuffle-kitten
  0 SingletonNode PlatformStateService.PLATFORM_STATE  /0  swap-inmate-frown-satisfy
  1 SingletonNode RosterService.ROSTER_STATE  /1  library-agent-antique-estate
  2 VirtualMap RosterService.ROSTERS  /2  box-toast-control-amazing
  3 StringLeaf 4101605859953644585  /3  maze-boat-sheriff-fish
  4 StringLeaf 427  /4  oblige-fork-uphold-segment
node2 4m 59.697s 2025-09-26 03:16:01.956 5177 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+11+18.596726997Z_seq0_minr1_maxr501_orgn0.pces
node2 4m 59.697s 2025-09-26 03:16:01.956 5178 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 401
File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+11+18.596726997Z_seq0_minr1_maxr501_orgn0.pces
node2 4m 59.697s 2025-09-26 03:16:01.956 5179 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 4m 59.704s 2025-09-26 03:16:01.963 5180 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 4m 59.705s 2025-09-26 03:16:01.964 5181 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 428 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/428 {"round":428,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/428/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 4m 59.707s 2025-09-26 03:16:01.966 5182 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/2
node4 5m 53.876s 2025-09-26 03:16:56.135 1 INFO STARTUP <main> StaticPlatformBuilder:
////////////////////// // Node is Starting // //////////////////////
node4 5m 53.968s 2025-09-26 03:16:56.227 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node4 5m 53.984s 2025-09-26 03:16:56.243 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 5m 54.100s 2025-09-26 03:16:56.359 4 INFO STARTUP <main> Browser: The following nodes [4] are set to run locally
node4 5m 54.107s 2025-09-26 03:16:56.366 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node4 5m 54.119s 2025-09-26 03:16:56.378 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 5m 54.555s 2025-09-26 03:16:56.814 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node4 5m 54.556s 2025-09-26 03:16:56.815 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node4 5m 55.464s 2025-09-26 03:16:57.723 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 907ms
node4 5m 55.472s 2025-09-26 03:16:57.731 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node4 5m 55.476s 2025-09-26 03:16:57.735 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 5m 55.517s 2025-09-26 03:16:57.776 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node4 5m 55.583s 2025-09-26 03:16:57.842 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node4 5m 55.583s 2025-09-26 03:16:57.842 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node4 5m 57.634s 2025-09-26 03:16:59.893 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node4 5m 57.725s 2025-09-26 03:16:59.984 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 57.732s 2025-09-26 03:16:59.991 21 INFO STARTUP <main> StartupStateUtils: The following saved states were found on disk:
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/244/SignedState.swh
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/151/SignedState.swh
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/61/SignedState.swh
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/2/SignedState.swh
node4 5m 57.732s 2025-09-26 03:16:59.991 22 INFO STARTUP <main> StartupStateUtils: Loading latest state from disk.
node4 5m 57.733s 2025-09-26 03:16:59.992 23 INFO STARTUP <main> StartupStateUtils: Loading signed state from disk: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/244/SignedState.swh
node4 5m 57.737s 2025-09-26 03:16:59.996 24 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 5m 57.741s 2025-09-26 03:17:00.000 25 INFO STATE_TO_DISK <main> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp
node4 5m 57.881s 2025-09-26 03:17:00.140 36 INFO STARTUP <main> StartupStateUtils: Loaded state's hash is the same as when it was saved.
node4 5m 57.884s 2025-09-26 03:17:00.143 37 INFO STARTUP <main> StartupStateUtils: Platform has loaded a saved state {"round":244,"consensusTimestamp":"2025-09-26T03:14:00.252134Z"} [com.swirlds.logging.legacy.payload.SavedStateLoadedPayload]
node4 5m 57.886s 2025-09-26 03:17:00.145 40 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 57.888s 2025-09-26 03:17:00.147 43 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node4 5m 57.890s 2025-09-26 03:17:00.149 44 INFO STARTUP <main> AddressBookInitializer: Using the loaded state's address book and weight values.
node4 5m 57.896s 2025-09-26 03:17:00.155 45 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 57.898s 2025-09-26 03:17:00.157 46 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 58.929s 2025-09-26 03:17:01.188 47 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26355088]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=244100, randomLong=8791716911098858715, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=64920, randomLong=3035430171485623473, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1106060, data=35, exception=null]
OS Health Check Report - Complete (took 1019 ms)
node4 5m 58.958s 2025-09-26 03:17:01.217 48 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node4 5m 59.051s 2025-09-26 03:17:01.310 49 INFO STARTUP <main> PcesUtilities: Span compaction completed for data/saved/preconsensus-events/4/2025/09/26/2025-09-26T03+11+18.869647062Z_seq0_minr1_maxr501_orgn0.pces, new upper bound is 265
node4 5m 59.054s 2025-09-26 03:17:01.313 50 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node4 5m 59.060s 2025-09-26 03:17:01.319 51 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node4 5m 59.140s 2025-09-26 03:17:01.399 52 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "IocxRg==", "port": 30124 }, { "ipAddressV4": "CoAAXA==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "aMaPYA==", "port": 30125 }, { "ipAddressV4": "CoAAXw==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "IjgW3A==", "port": 30126 }, { "ipAddressV4": "CoAAXQ==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "Iogp+A==", "port": 30127 }, { "ipAddressV4": "CoAAXg==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "Ih0pyA==", "port": 30128 }, { "ipAddressV4": "CoAAYA==", "port": 30128 }] }] }
node4 5m 59.162s 2025-09-26 03:17:01.421 53 INFO STARTUP <main> ConsistencyTestingToolState: State initialized with state long 9099383761427963870.
node4 5m 59.163s 2025-09-26 03:17:01.422 54 INFO STARTUP <main> ConsistencyTestingToolState: State initialized with 243 rounds handled.
node4 5m 59.163s 2025-09-26 03:17:01.422 55 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 5m 59.164s 2025-09-26 03:17:01.423 56 INFO STARTUP <main> TransactionHandlingHistory: Log file found. Parsing previous history
node0 5m 59.258s 2025-09-26 03:17:01.517 6201 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 524 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 5m 59.414s 2025-09-26 03:17:01.673 6287 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 524 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 5m 59.565s 2025-09-26 03:17:01.824 6273 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 524 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 5m 59.611s 2025-09-26 03:17:01.870 6285 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 524 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 5m 59.730s 2025-09-26 03:17:01.989 6288 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 524 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/524
node3 5m 59.731s 2025-09-26 03:17:01.990 6289 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 524
node1 5m 59.740s 2025-09-26 03:17:01.999 6290 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 524 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/524
node1 5m 59.741s 2025-09-26 03:17:02.000 6291 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 524
node1 5m 59.813s 2025-09-26 03:17:02.072 6326 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 524
node1 5m 59.815s 2025-09-26 03:17:02.074 6327 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 524
Timestamp: 2025-09-26T03:17:00.462648Z
Next consensus number: 11484
Legacy running event hash: 7cc4ca428990d9aaff19f6a81d490cb71a76ca9f57dc423100854ba800e8ffb376740024ed8fb8ae4f9c5763be06746e
Legacy running event mnemonic: obtain-result-sock-skull
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1502291470
Root hash: 0f8f78c5ddd99a766c2a70d36b5a718df04311a709692bf9358deab6187d3fc9c6bfb495e7f2df15cb1e753ba18e4f8e
(root) ConsistencyTestingToolState / vehicle-board-eager-rally
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 jazz-page-circle-chronic
    1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
    2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
    3 StringLeaf -4876472304159893378 /3 violin-mammal-pink-avocado
    4 StringLeaf 523 /4 festival-ripple-recipe-avocado
node1 5m 59.820s 2025-09-26 03:17:02.079 6328 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+16+46.871123921Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+11+18.558338059Z_seq0_minr1_maxr501_orgn0.pces
node1 5m 59.820s 2025-09-26 03:17:02.079 6329 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus event files meeting specified criteria to copy.
Lower bound: 497
First file to copy: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+11+18.558338059Z_seq0_minr1_maxr501_orgn0.pces
Last file to copy: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+16+46.871123921Z_seq1_minr474_maxr5474_orgn0.pces
node1 5m 59.820s 2025-09-26 03:17:02.079 6330 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 2 preconsensus event file(s)
node3 5m 59.820s 2025-09-26 03:17:02.079 6320 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 524
node3 5m 59.822s 2025-09-26 03:17:02.081 6321 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 524
Timestamp: 2025-09-26T03:17:00.462648Z
Next consensus number: 11484
Legacy running event hash: 7cc4ca428990d9aaff19f6a81d490cb71a76ca9f57dc423100854ba800e8ffb376740024ed8fb8ae4f9c5763be06746e
Legacy running event mnemonic: obtain-result-sock-skull
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1502291470
Root hash: 0f8f78c5ddd99a766c2a70d36b5a718df04311a709692bf9358deab6187d3fc9c6bfb495e7f2df15cb1e753ba18e4f8e
(root) ConsistencyTestingToolState / vehicle-board-eager-rally
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 jazz-page-circle-chronic
    1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
    2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
    3 StringLeaf -4876472304159893378 /3 violin-mammal-pink-avocado
    4 StringLeaf 523 /4 festival-ripple-recipe-avocado
node3 5m 59.828s 2025-09-26 03:17:02.087 6330 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+16+46.913686598Z_seq1_minr473_maxr5473_orgn0.pces
Last file: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+11+19.110292014Z_seq0_minr1_maxr501_orgn0.pces
node3 5m 59.828s 2025-09-26 03:17:02.087 6331 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus event files meeting specified criteria to copy.
Lower bound: 497
First file to copy: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+11+19.110292014Z_seq0_minr1_maxr501_orgn0.pces
Last file to copy: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+16+46.913686598Z_seq1_minr473_maxr5473_orgn0.pces
node3 5m 59.828s 2025-09-26 03:17:02.087 6332 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 2 preconsensus event file(s)
node1 5m 59.829s 2025-09-26 03:17:02.088 6331 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 2 preconsensus event file(s)
node1 5m 59.829s 2025-09-26 03:17:02.088 6332 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 524 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/524 {"round":524,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/524/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 5m 59.831s 2025-09-26 03:17:02.090 6333 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/61
node3 5m 59.837s 2025-09-26 03:17:02.096 6333 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 2 preconsensus event file(s)
node3 5m 59.838s 2025-09-26 03:17:02.097 6334 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 524 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/524 {"round":524,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/524/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 5m 59.839s 2025-09-26 03:17:02.098 6335 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/61
node4 5m 59.939s 2025-09-26 03:17:02.198 57 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 244
Timestamp: 2025-09-26T03:14:00.252134Z
Next consensus number: 6632
Legacy running event hash: 44af34a3649465d7156413857917fb34ef26eca3130d3a1504478dcd0f586b133af0388ccc3ba35fab23430d3ac732e2
Legacy running event mnemonic: nominee-nerve-myth-road
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1804150919
Root hash: a0aa9423aef6f2d6eb3120b7ebf7c335fe3348d84fe5caf0aa4d424e39e382a935cf2f2fa36d52554a9fdf9608798f83
(root) ConsistencyTestingToolState / turn-knife-candy-pipe
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 donate-defense-faith-acoustic
    1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
    2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
    3 StringLeaf 9099383761427963870 /3 angle-shock-crush-duck
    4 StringLeaf 243 /4 whip-ignore-path-auction
node0 6.002m 2025-09-26 03:17:02.364 6223 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 524 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/524
node0 6.002m 2025-09-26 03:17:02.365 6224 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 524
node2 6.003m 2025-09-26 03:17:02.427 6276 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 524 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/524
node2 6.003m 2025-09-26 03:17:02.428 6277 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 524
node4 6.003m 2025-09-26 03:17:02.439 59 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 44af34a3649465d7156413857917fb34ef26eca3130d3a1504478dcd0f586b133af0388ccc3ba35fab23430d3ac732e2
node4 6.003m 2025-09-26 03:17:02.452 60 INFO STARTUP <platformForkJoinThread-4> Shadowgraph: Shadowgraph starting from expiration threshold 217
node0 6.003m 2025-09-26 03:17:02.458 6260 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 524
node0 6.003m 2025-09-26 03:17:02.460 6261 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 524
Timestamp: 2025-09-26T03:17:00.462648Z
Next consensus number: 11484
Legacy running event hash: 7cc4ca428990d9aaff19f6a81d490cb71a76ca9f57dc423100854ba800e8ffb376740024ed8fb8ae4f9c5763be06746e
Legacy running event mnemonic: obtain-result-sock-skull
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1502291470
Root hash: 0f8f78c5ddd99a766c2a70d36b5a718df04311a709692bf9358deab6187d3fc9c6bfb495e7f2df15cb1e753ba18e4f8e
(root) ConsistencyTestingToolState / vehicle-board-eager-rally
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 jazz-page-circle-chronic
    1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
    2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
    3 StringLeaf -4876472304159893378 /3 violin-mammal-pink-avocado
    4 StringLeaf 523 /4 festival-ripple-recipe-avocado
node4 6.003m 2025-09-26 03:17:02.460 62 INFO STARTUP <<start-node-4>> ConsistencyTestingToolMain: init called in Main for node 4.
node4 6.003m 2025-09-26 03:17:02.461 63 INFO STARTUP <<start-node-4>> SwirldsPlatform: Starting platform 4
node4 6.003m 2025-09-26 03:17:02.463 64 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node4 6.003m 2025-09-26 03:17:02.466 65 INFO STARTUP <<start-node-4>> CycleFinder: No cyclical back pressure detected in wiring model.
node4 6.003m 2025-09-26 03:17:02.467 66 INFO STARTUP <<start-node-4>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node0 6.003m 2025-09-26 03:17:02.468 6262 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+16+46.725519717Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+11+18.938316448Z_seq0_minr1_maxr501_orgn0.pces
node0 6.003m 2025-09-26 03:17:02.468 6263 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus event files meeting specified criteria to copy.
Lower bound: 497
First file to copy: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+11+18.938316448Z_seq0_minr1_maxr501_orgn0.pces
Last file to copy: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+16+46.725519717Z_seq1_minr474_maxr5474_orgn0.pces
node0 6.003m 2025-09-26 03:17:02.468 6264 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 2 preconsensus event file(s)
node4 6.003m 2025-09-26 03:17:02.468 67 INFO STARTUP <<start-node-4>> InputWireChecks: All input wires have been bound.
node4 6.004m 2025-09-26 03:17:02.471 68 INFO STARTUP <<start-node-4>> SwirldsPlatform: replaying preconsensus event stream starting at 217
node4 6.004m 2025-09-26 03:17:02.475 69 INFO PLATFORM_STATUS <platformForkJoinThread-6> DefaultStatusStateMachine: Platform spent 194.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node0 6.004m 2025-09-26 03:17:02.477 6265 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 2 preconsensus event file(s)
node0 6.004m 2025-09-26 03:17:02.477 6266 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 524 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/524 {"round":524,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/524/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 6.004m 2025-09-26 03:17:02.479 6267 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/61
node2 6.004m 2025-09-26 03:17:02.514 6320 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 524
node2 6.004m 2025-09-26 03:17:02.516 6321 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 524 Timestamp: 2025-09-26T03:17:00.462648Z Next consensus number: 11484 Legacy running event hash: 7cc4ca428990d9aaff19f6a81d490cb71a76ca9f57dc423100854ba800e8ffb376740024ed8fb8ae4f9c5763be06746e Legacy running event mnemonic: obtain-result-sock-skull Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1502291470 Root hash: 0f8f78c5ddd99a766c2a70d36b5a718df04311a709692bf9358deab6187d3fc9c6bfb495e7f2df15cb1e753ba18e4f8e (root) ConsistencyTestingToolState / vehicle-board-eager-rally 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 jazz-page-circle-chronic 1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate 2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing 3 StringLeaf -4876472304159893378 /3 violin-mammal-pink-avocado 4 StringLeaf 523 /4 festival-ripple-recipe-avocado
node2 6.004m 2025-09-26 03:17:02.521 6322 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+16+46.935417942Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+11+18.596726997Z_seq0_minr1_maxr501_orgn0.pces
node2 6.004m 2025-09-26 03:17:02.522 6323 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus event files meeting specified criteria to copy.
Lower bound: 497
First file to copy: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+11+18.596726997Z_seq0_minr1_maxr501_orgn0.pces
Last file to copy: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+16+46.935417942Z_seq1_minr474_maxr5474_orgn0.pces
node2 6.004m 2025-09-26 03:17:02.522 6324 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 2 preconsensus event file(s)
node2 6.005m 2025-09-26 03:17:02.530 6325 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 2 preconsensus event file(s)
node2 6.005m 2025-09-26 03:17:02.531 6326 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 524 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/524 {"round":524,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/524/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 6.005m 2025-09-26 03:17:02.532 6327 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/61
node4 6.007m 2025-09-26 03:17:02.689 70 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:4 H:9fe44906a695 BR:242), num remaining: 4
node4 6.007m 2025-09-26 03:17:02.690 71 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:0 H:7857891a68e8 BR:242), num remaining: 3
node4 6.007m 2025-09-26 03:17:02.691 72 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:3 H:b1ed939ca058 BR:242), num remaining: 2
node4 6.007m 2025-09-26 03:17:02.692 73 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:1 H:c6f4d5cbd894 BR:243), num remaining: 1
node4 6.007m 2025-09-26 03:17:02.692 74 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:2 H:2b8134a800d2 BR:242), num remaining: 0
node4 6.009m 2025-09-26 03:17:02.806 130 INFO STARTUP <<start-node-4>> PcesReplayer: Replayed 1,286 preconsensus events with max birth round 265. These events contained 3,259 transactions. 20 rounds reached consensus spanning 12.1 seconds of consensus time. The latest round to reach consensus is round 264. Replay took 334.0 milliseconds.
node4 6.009m 2025-09-26 03:17:02.808 131 INFO STARTUP <<app: appMain 4>> ConsistencyTestingToolMain: run called in Main.
node4 6.009m 2025-09-26 03:17:02.810 132 INFO PLATFORM_STATUS <platformForkJoinThread-2> DefaultStatusStateMachine: Platform spent 332.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node4 6m 1.384s 2025-09-26 03:17:03.643 251 INFO PLATFORM_STATUS <platformForkJoinThread-8> DefaultStatusStateMachine: Platform spent 831.0 ms in OBSERVING. Now in BEHIND
node4 6m 1.385s 2025-09-26 03:17:03.644 252 INFO RECONNECT <platformForkJoinThread-2> ReconnectController: Starting ReconnectController
node4 6m 1.386s 2025-09-26 03:17:03.645 253 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectPlatformHelperImpl: Preparing for reconnect, stopping gossip
node4 6m 1.517s 2025-09-26 03:17:03.776 254 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectPlatformHelperImpl: Preparing for reconnect, start clearing queues
node4 6m 1.519s 2025-09-26 03:17:03.778 255 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectPlatformHelperImpl: Queues have been cleared
node4 6m 1.520s 2025-09-26 03:17:03.779 256 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectController: waiting for reconnect connection
node4 6m 1.521s 2025-09-26 03:17:03.780 257 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectController: acquired reconnect connection
node3 6m 1.627s 2025-09-26 03:17:03.886 6361 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> ReconnectTeacher: Starting reconnect in the role of the sender {"receiving":false,"nodeId":3,"otherNodeId":4,"round":526} [com.swirlds.logging.legacy.payload.ReconnectStartPayload]
node3 6m 1.628s 2025-09-26 03:17:03.887 6362 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> ReconnectTeacher: The following state will be sent to the learner:
Round: 526
Timestamp: 2025-09-26T03:17:01.831118539Z
Next consensus number: 11515
Legacy running event hash: c9f08953e6573cbb92ead46f66d3578c2167241db9791d745f3291bd9916ada65f1fb59f543c564f0364322325be78ab
Legacy running event mnemonic: search-stick-cry-number
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1580595110
Root hash: 5c984bacfd5fa518b2f24d30c34ae11d6dcd087d18556e2f0a089433aecb6b40652473f0cf16f8210121d88e3954d4fd
(root) ConsistencyTestingToolState / narrow-intact-feature-fringe
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 thrive-gym-coffee-gospel
    1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
    2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
    3 StringLeaf 8081638042399162344 /3 soldier-action-swim-torch
    4 StringLeaf 525 /4 opera-episode-quantum-summer
node3 6m 1.629s 2025-09-26 03:17:03.888 6363 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> ReconnectTeacher: Sending signatures from nodes 0, 2, 3 (signing weight = 37500000000/50000000000) for state hash 5c984bacfd5fa518b2f24d30c34ae11d6dcd087d18556e2f0a089433aecb6b40652473f0cf16f8210121d88e3954d4fd
node3 6m 1.629s 2025-09-26 03:17:03.888 6364 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> ReconnectTeacher: Starting synchronization in the role of the sender.
node3 6m 1.633s 2025-09-26 03:17:03.892 6365 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> TeachingSynchronizer: sending tree rooted at com.swirlds.demo.consistency.ConsistencyTestingToolState with route []
node3 6m 1.642s 2025-09-26 03:17:03.901 6366 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@645458f2 start run()
node4 6m 1.693s 2025-09-26 03:17:03.952 258 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectSyncHelper: Starting reconnect in role of the receiver. {"receiving":true,"nodeId":4,"otherNodeId":3,"round":264} [com.swirlds.logging.legacy.payload.ReconnectStartPayload]
node4 6m 1.695s 2025-09-26 03:17:03.954 259 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectLearner: Receiving signed state signatures
node4 6m 1.700s 2025-09-26 03:17:03.959 260 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectLearner: Received signatures from nodes 0, 2, 3
node4 6m 1.702s 2025-09-26 03:17:03.961 261 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls receiveTree()
node4 6m 1.703s 2025-09-26 03:17:03.962 262 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: synchronizing tree
node4 6m 1.703s 2025-09-26 03:17:03.962 263 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: receiving tree rooted at com.swirlds.demo.consistency.ConsistencyTestingToolState with route []
node4 6m 1.709s 2025-09-26 03:17:03.968 264 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@251edad1 start run()
node4 6m 1.714s 2025-09-26 03:17:03.973 265 INFO STARTUP <<work group learning-synchronizer: async-input-stream #0>> ConsistencyTestingToolState: New State Constructed.
node3 6m 1.795s 2025-09-26 03:17:04.054 6385 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@645458f2 finish run()
node3 6m 1.796s 2025-09-26 03:17:04.055 6386 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> TeachingSynchronizer: finished sending tree
node3 6m 1.797s 2025-09-26 03:17:04.056 6387 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> TeachingSynchronizer: sending tree rooted at com.swirlds.virtualmap.VirtualMap with route [2]
node3 6m 1.798s 2025-09-26 03:17:04.057 6388 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@34e1f3e6 start run()
node4 6m 1.909s 2025-09-26 03:17:04.168 287 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread finished the learning loop for the current subtree
node4 6m 1.910s 2025-09-26 03:17:04.169 288 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread closed input, output, and view for the current subtree
node4 6m 1.910s 2025-09-26 03:17:04.169 289 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@251edad1 finish run()
node4 6m 1.911s 2025-09-26 03:17:04.170 290 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: received tree rooted at com.swirlds.demo.consistency.ConsistencyTestingToolState with route []
node4 6m 1.911s 2025-09-26 03:17:04.170 291 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: receiving tree rooted at com.swirlds.virtualmap.VirtualMap with route [2]
node4 6m 1.915s 2025-09-26 03:17:04.174 292 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@52403e50 start run()
node4 6m 1.975s 2025-09-26 03:17:04.234 293 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> ReconnectNodeRemover: setPathInformation(): firstLeafPath: 1 -> 1, lastLeafPath: 1 -> 1
node4 6m 1.975s 2025-09-26 03:17:04.234 294 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> ReconnectNodeRemover: setPathInformation(): done
node4 6m 1.978s 2025-09-26 03:17:04.237 295 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread finished the learning loop for the current subtree
node4 6m 1.978s 2025-09-26 03:17:04.237 296 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: call nodeRemover.allNodesReceived()
node4 6m 1.978s 2025-09-26 03:17:04.237 297 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> ReconnectNodeRemover: allNodesReceived()
node4 6m 1.979s 2025-09-26 03:17:04.238 298 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> ReconnectNodeRemover: allNodesReceived(): done
node4 6m 1.979s 2025-09-26 03:17:04.238 299 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: call root.endLearnerReconnect()
node4 6m 1.979s 2025-09-26 03:17:04.238 300 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call reconnectIterator.close()
node4 6m 1.980s 2025-09-26 03:17:04.239 301 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call setHashPrivate()
node3 6m 2.048s 2025-09-26 03:17:04.307 6392 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@34e1f3e6 finish run()
node3 6m 2.048s 2025-09-26 03:17:04.307 6393 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> TeachingSynchronizer: finished sending tree
node3 6m 2.051s 2025-09-26 03:17:04.310 6396 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> ReconnectTeacher: Finished synchronization in the role of the sender.
node4 6m 2.135s 2025-09-26 03:17:04.394 311 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call postInit()
node4 6m 2.136s 2025-09-26 03:17:04.395 313 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: endLearnerReconnect() complete
node4 6m 2.136s 2025-09-26 03:17:04.395 314 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: close() complete
node4 6m 2.136s 2025-09-26 03:17:04.395 315 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread closed input, output, and view for the current subtree
node4 6m 2.137s 2025-09-26 03:17:04.396 316 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@52403e50 finish run()
node4 6m 2.137s 2025-09-26 03:17:04.396 317 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: received tree rooted at com.swirlds.virtualmap.VirtualMap with route [2]
node4 6m 2.138s 2025-09-26 03:17:04.397 318 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: synchronization complete
node4 6m 2.138s 2025-09-26 03:17:04.397 319 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls initialize()
node4 6m 2.138s 2025-09-26 03:17:04.397 320 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: initializing tree
node4 6m 2.138s 2025-09-26 03:17:04.397 321 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: initialization complete
node4 6m 2.139s 2025-09-26 03:17:04.398 322 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls hash()
node4 6m 2.139s 2025-09-26 03:17:04.398 323 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: hashing tree
node4 6m 2.140s 2025-09-26 03:17:04.399 324 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: hashing complete
node4 6m 2.140s 2025-09-26 03:17:04.399 325 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls logStatistics()
node4 6m 2.143s 2025-09-26 03:17:04.402 326 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: Finished synchronization {"timeInSeconds":0.435,"hashTimeInSeconds":0.001,"initializationTimeInSeconds":0.0,"totalNodes":12,"leafNodes":7,"redundantLeafNodes":4,"internalNodes":5,"redundantInternalNodes":2} [com.swirlds.logging.legacy.payload.SynchronizationCompletePayload]
node4 6m 2.143s 2025-09-26 03:17:04.402 327 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: ReconnectMapMetrics: transfersFromTeacher=12; transfersFromLearner=10; internalHashes=0; internalCleanHashes=0; internalData=0; internalCleanData=0; leafHashes=3; leafCleanHashes=3; leafData=7; leafCleanData=4
node4 6m 2.144s 2025-09-26 03:17:04.403 328 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner is done synchronizing
node4 6m 2.147s 2025-09-26 03:17:04.406 329 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectLearner: Reconnect data usage report {"dataMegabytes":0.006054878234863281} [com.swirlds.logging.legacy.payload.ReconnectDataUsagePayload]
node4 6m 2.151s 2025-09-26 03:17:04.410 330 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectSyncHelper: Finished reconnect in the role of the receiver. {"receiving":true,"nodeId":4,"otherNodeId":3,"round":526,"success":false} [com.swirlds.logging.legacy.payload.ReconnectFinishPayload]
node4 6m 2.151s 2025-09-26 03:17:04.410 331 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectSyncHelper: Information for state received during reconnect:
Round: 526
Timestamp: 2025-09-26T03:17:01.831118539Z
Next consensus number: 11515
Legacy running event hash: c9f08953e6573cbb92ead46f66d3578c2167241db9791d745f3291bd9916ada65f1fb59f543c564f0364322325be78ab
Legacy running event mnemonic: search-stick-cry-number
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1580595110
Root hash: 5c984bacfd5fa518b2f24d30c34ae11d6dcd087d18556e2f0a089433aecb6b40652473f0cf16f8210121d88e3954d4fd
(root) ConsistencyTestingToolState / narrow-intact-feature-fringe
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 thrive-gym-coffee-gospel
    1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
    2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
    3 StringLeaf 8081638042399162344 /3 soldier-action-swim-torch
    4 StringLeaf 525 /4 opera-episode-quantum-summer
node4 6m 2.152s 2025-09-26 03:17:04.411 333 DEBUG RECONNECT <<reconnect: reconnect-controller>> ReconnectStateLoader: `loadReconnectState` : reloading state
node4 6m 2.153s 2025-09-26 03:17:04.412 334 INFO STARTUP <<reconnect: reconnect-controller>> ConsistencyTestingToolState: State initialized with state long 8081638042399162344.
node4 6m 2.153s 2025-09-26 03:17:04.412 335 INFO STARTUP <<reconnect: reconnect-controller>> ConsistencyTestingToolState: State initialized with 525 rounds handled.
node4 6m 2.153s 2025-09-26 03:17:04.412 336 INFO STARTUP <<reconnect: reconnect-controller>> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 6m 2.153s 2025-09-26 03:17:04.412 337 INFO STARTUP <<reconnect: reconnect-controller>> TransactionHandlingHistory: Log file found. Parsing previous history
node4 6m 2.177s 2025-09-26 03:17:04.436 342 INFO STATE_TO_DISK <<reconnect: reconnect-controller>> DefaultSavedStateController: Signed state from round 526 created, will eventually be written to disk, for reason: RECONNECT
node4 6m 2.177s 2025-09-26 03:17:04.436 343 INFO PLATFORM_STATUS <platformForkJoinThread-8> DefaultStatusStateMachine: Platform spent 792.0 ms in BEHIND. Now in RECONNECT_COMPLETE
node4 6m 2.178s 2025-09-26 03:17:04.437 344 INFO STARTUP <platformForkJoinThread-4> Shadowgraph: Shadowgraph starting from expiration threshold 499
node4 6m 2.180s 2025-09-26 03:17:04.439 347 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 526 state to disk. Reason: RECONNECT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/526
node4 6m 2.181s 2025-09-26 03:17:04.440 348 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/2 for round 526
node4 6m 2.191s 2025-09-26 03:17:04.450 356 INFO EVENT_STREAM <<reconnect: reconnect-controller>> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: c9f08953e6573cbb92ead46f66d3578c2167241db9791d745f3291bd9916ada65f1fb59f543c564f0364322325be78ab
node4 6m 2.192s 2025-09-26 03:17:04.451 359 INFO STARTUP <platformForkJoinThread-6> PcesFileManager: Due to recent operations on this node, the local preconsensus event stream will have a discontinuity. The last file with the old origin round is 2025-09-26T03+11+18.869647062Z_seq0_minr1_maxr265_orgn0.pces. All future files will have an origin round of 526.
node3 6m 2.220s 2025-09-26 03:17:04.479 6397 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> ReconnectTeacher: Finished reconnect in the role of the sender. {"receiving":false,"nodeId":3,"otherNodeId":4,"round":526,"success":false} [com.swirlds.logging.legacy.payload.ReconnectFinishPayload]
node4 6m 2.329s 2025-09-26 03:17:04.588 385 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/2 for round 526
node4 6m 2.333s 2025-09-26 03:17:04.592 386 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 526
Timestamp: 2025-09-26T03:17:01.831118539Z
Next consensus number: 11515
Legacy running event hash: c9f08953e6573cbb92ead46f66d3578c2167241db9791d745f3291bd9916ada65f1fb59f543c564f0364322325be78ab
Legacy running event mnemonic: search-stick-cry-number
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1580595110
Root hash: 5c984bacfd5fa518b2f24d30c34ae11d6dcd087d18556e2f0a089433aecb6b40652473f0cf16f8210121d88e3954d4fd
(root) ConsistencyTestingToolState / narrow-intact-feature-fringe
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 thrive-gym-coffee-gospel
    1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
    2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
    3 StringLeaf 8081638042399162344 /3 soldier-action-swim-torch
    4 StringLeaf 525 /4 opera-episode-quantum-summer
node4 6m 2.376s 2025-09-26 03:17:04.635 387 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T03+11+18.869647062Z_seq0_minr1_maxr265_orgn0.pces
node4 6m 2.377s 2025-09-26 03:17:04.636 388 WARN STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: No preconsensus event files meeting specified criteria found to copy. Lower bound: 499
node4 6m 2.385s 2025-09-26 03:17:04.644 389 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 526 to disk. Reason: RECONNECT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/526 {"round":526,"freezeState":false,"reason":"RECONNECT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/526/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 6m 2.389s 2025-09-26 03:17:04.648 390 INFO PLATFORM_STATUS <platformForkJoinThread-6> DefaultStatusStateMachine: Platform spent 210.0 ms in RECONNECT_COMPLETE. Now in CHECKING
node4 6m 3.073s 2025-09-26 03:17:05.332 391 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:0 H:e79cad8bd931 BR:524), num remaining: 3
node4 6m 3.074s 2025-09-26 03:17:05.333 392 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:2 H:52b6d9dabf21 BR:525), num remaining: 2
node4 6m 3.075s 2025-09-26 03:17:05.334 393 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:1 H:67ef23b84e5d BR:524), num remaining: 1
node4 6m 3.086s 2025-09-26 03:17:05.345 394 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:3 H:996ea569aeba BR:525), num remaining: 0
node4 6m 3.216s 2025-09-26 03:17:05.475 409 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting4.csv' ]
node4 6m 3.219s 2025-09-26 03:17:05.478 410 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node4 6m 7.392s 2025-09-26 03:17:09.651 483 INFO PLATFORM_STATUS <platformForkJoinThread-5> DefaultStatusStateMachine: Platform spent 5.0 s in CHECKING. Now in ACTIVE
node2 6m 59.619s 2025-09-26 03:18:01.878 7509 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 631 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 6m 59.659s 2025-09-26 03:18:01.918 1556 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 631 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 6m 59.669s 2025-09-26 03:18:01.928 7469 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 631 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 6m 59.674s 2025-09-26 03:18:01.933 7543 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 631 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 6m 59.740s 2025-09-26 03:18:01.999 7570 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 631 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 7.000m 2025-09-26 03:18:02.285 1559 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 631 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/631
node1 7.000m 2025-09-26 03:18:02.286 7546 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 631 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/631
node4 7.000m 2025-09-26 03:18:02.286 1560 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 631
node1 7.000m 2025-09-26 03:18:02.287 7547 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 631
node3 7.002m 2025-09-26 03:18:02.357 7573 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 631 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/631
node3 7.002m 2025-09-26 03:18:02.357 7574 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/44 for round 631
node1 7.002m 2025-09-26 03:18:02.373 7578 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 631
node1 7.002m 2025-09-26 03:18:02.375 7579 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 631
Timestamp: 2025-09-26T03:18:00.517801Z
Next consensus number: 13880
Legacy running event hash: 59fd028468b6754942d89c81e6b0834cf458e79b23a63c3e529009dd6b31f51a5b4ce4fd20ea4317187b0dd3fcb7fa7c
Legacy running event mnemonic: distance-explain-bitter-powder
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1284951305
Root hash: 7943322b02117ec52090d52374f1b3bf6c019929ba70c9c6293d34569010620a693f675a2748b1f4b4ad2a9a2250fecc
(root) ConsistencyTestingToolState / chronic-earth-drama-suffer
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 essay-actual-time-staff
    1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
    2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
    3 StringLeaf 8119594301207269762 /3 absent-parade-mechanic-man
    4 StringLeaf 630 /4 return-original-make-engage
node1 7.002m 2025-09-26 03:18:02.384 7580 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+16+46.871123921Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+11+18.558338059Z_seq0_minr1_maxr501_orgn0.pces
node1 7.002m 2025-09-26 03:18:02.384 7581 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 604
File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+16+46.871123921Z_seq1_minr474_maxr5474_orgn0.pces
node1 7.002m 2025-09-26 03:18:02.384 7582 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 7.002m 2025-09-26 03:18:02.387 7583 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 7.002m 2025-09-26 03:18:02.387 7584 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 631 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/631 {"round":631,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/631/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 7.002m 2025-09-26 03:18:02.389 7585 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/151
node4 7.002m 2025-09-26 03:18:02.390 1594 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 631
node4 7.002m 2025-09-26 03:18:02.392 1595 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 631
Timestamp: 2025-09-26T03:18:00.517801Z
Next consensus number: 13880
Legacy running event hash: 59fd028468b6754942d89c81e6b0834cf458e79b23a63c3e529009dd6b31f51a5b4ce4fd20ea4317187b0dd3fcb7fa7c
Legacy running event mnemonic: distance-explain-bitter-powder
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1284951305
Root hash: 7943322b02117ec52090d52374f1b3bf6c019929ba70c9c6293d34569010620a693f675a2748b1f4b4ad2a9a2250fecc
(root) ConsistencyTestingToolState / chronic-earth-drama-suffer
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 essay-actual-time-staff
    1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
    2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
    3 StringLeaf 8119594301207269762 /3 absent-parade-mechanic-man
    4 StringLeaf 630 /4 return-original-make-engage
node4 7.002m 2025-09-26 03:18:02.401 1596 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T03+17+04.766230575Z_seq1_minr499_maxr999_orgn526.pces
Last file: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T03+11+18.869647062Z_seq0_minr1_maxr265_orgn0.pces
node4 7.002m 2025-09-26 03:18:02.401 1597 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 604
File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T03+17+04.766230575Z_seq1_minr499_maxr999_orgn526.pces
node4 7.002m 2025-09-26 03:18:02.401 1598 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 7.002m 2025-09-26 03:18:02.405 1599 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 7.002m 2025-09-26 03:18:02.405 1600 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 631 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/631 {"round":631,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/631/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 7.002m 2025-09-26 03:18:02.407 1601 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/2
node0 7.003m 2025-09-26 03:18:02.428 7472 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 631 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/631
node0 7.003m 2025-09-26 03:18:02.429 7473 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 631
node2 7.003m 2025-09-26 03:18:02.433 7512 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 631 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/631
node2 7.003m 2025-09-26 03:18:02.434 7513 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 631
node3 7.003m 2025-09-26 03:18:02.445 7605 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/44 for round 631
node3 7.003m 2025-09-26 03:18:02.447 7606 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 631
Timestamp: 2025-09-26T03:18:00.517801Z
Next consensus number: 13880
Legacy running event hash: 59fd028468b6754942d89c81e6b0834cf458e79b23a63c3e529009dd6b31f51a5b4ce4fd20ea4317187b0dd3fcb7fa7c
Legacy running event mnemonic: distance-explain-bitter-powder
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1284951305
Root hash: 7943322b02117ec52090d52374f1b3bf6c019929ba70c9c6293d34569010620a693f675a2748b1f4b4ad2a9a2250fecc
(root) ConsistencyTestingToolState / chronic-earth-drama-suffer
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 essay-actual-time-staff
    1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
    2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
    3 StringLeaf 8119594301207269762 /3 absent-parade-mechanic-man
    4 StringLeaf 630 /4 return-original-make-engage
node3 7.003m 2025-09-26 03:18:02.454 7607 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+16+46.913686598Z_seq1_minr473_maxr5473_orgn0.pces
Last file: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+11+19.110292014Z_seq0_minr1_maxr501_orgn0.pces
node3 7.003m 2025-09-26 03:18:02.454 7608 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 604
File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+16+46.913686598Z_seq1_minr473_maxr5473_orgn0.pces
node3 7.003m 2025-09-26 03:18:02.454 7609 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 7.003m 2025-09-26 03:18:02.456 7610 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 7.003m 2025-09-26 03:18:02.457 7611 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 631 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/631 {"round":631,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/631/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 7.003m 2025-09-26 03:18:02.458 7612 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/151
node2 7.004m 2025-09-26 03:18:02.524 7548 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 631
node2 7.004m 2025-09-26 03:18:02.526 7549 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 631
Timestamp: 2025-09-26T03:18:00.517801Z
Next consensus number: 13880
Legacy running event hash: 59fd028468b6754942d89c81e6b0834cf458e79b23a63c3e529009dd6b31f51a5b4ce4fd20ea4317187b0dd3fcb7fa7c
Legacy running event mnemonic: distance-explain-bitter-powder
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1284951305
Root hash: 7943322b02117ec52090d52374f1b3bf6c019929ba70c9c6293d34569010620a693f675a2748b1f4b4ad2a9a2250fecc
(root) ConsistencyTestingToolState / chronic-earth-drama-suffer
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 essay-actual-time-staff
    1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
    2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
    3 StringLeaf 8119594301207269762 /3 absent-parade-mechanic-man
    4 StringLeaf 630 /4 return-original-make-engage
node0 7.004m 2025-09-26 03:18:02.528 7516 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 631
node0 7.005m 2025-09-26 03:18:02.531 7517 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 631
Timestamp: 2025-09-26T03:18:00.517801Z
Next consensus number: 13880
Legacy running event hash: 59fd028468b6754942d89c81e6b0834cf458e79b23a63c3e529009dd6b31f51a5b4ce4fd20ea4317187b0dd3fcb7fa7c
Legacy running event mnemonic: distance-explain-bitter-powder
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1284951305
Root hash: 7943322b02117ec52090d52374f1b3bf6c019929ba70c9c6293d34569010620a693f675a2748b1f4b4ad2a9a2250fecc
(root) ConsistencyTestingToolState / chronic-earth-drama-suffer
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 essay-actual-time-staff
    1 SingletonNode RosterService.ROSTER_STATE /1 library-agent-antique-estate
    2 VirtualMap RosterService.ROSTERS /2 box-toast-control-amazing
    3 StringLeaf 8119594301207269762 /3 absent-parade-mechanic-man
    4 StringLeaf 630 /4 return-original-make-engage
node2 7.005m 2025-09-26 03:18:02.535 7550 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+16+46.935417942Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+11+18.596726997Z_seq0_minr1_maxr501_orgn0.pces
node2 7.005m 2025-09-26 03:18:02.535 7551 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 604
File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+16+46.935417942Z_seq1_minr474_maxr5474_orgn0.pces
node2 7.005m 2025-09-26 03:18:02.536 7552 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 7.005m 2025-09-26 03:18:02.538 7553 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 7.005m 2025-09-26 03:18:02.538 7554 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 631 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/631 {"round":631,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/631/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 7.005m 2025-09-26 03:18:02.540 7518 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+16+46.725519717Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+11+18.938316448Z_seq0_minr1_maxr501_orgn0.pces
node0 7.005m 2025-09-26 03:18:02.540 7519 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 604
File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+16+46.725519717Z_seq1_minr474_maxr5474_orgn0.pces
node2 7.005m 2025-09-26 03:18:02.540 7555 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/151
node0 7.005m 2025-09-26 03:18:02.541 7520 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 7.005m 2025-09-26 03:18:02.543 7521 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 7.005m 2025-09-26 03:18:02.543 7522 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 631 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/631 {"round":631,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/631/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 7.005m 2025-09-26 03:18:02.545 7523 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/151
node0 7m 55.791s 2025-09-26 03:18:58.050 8460 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith1 0 to 1>> NetworkUtils: Connection broken: 0 -> 1
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-26T03:18:58.050308105Z
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-26T03:18:58.050308105Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:180)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
    ... 7 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
    at java.base/java.io.DataInputStream.readBoolean(DataInputStream.java:255)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readBoolean(AugmentedDataInputStream.java:137)
    at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readMyTipsTheyHave$7(SyncUtils.java:163)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
    ... 11 more
node3 7m 55.792s 2025-09-26 03:18:58.051 8569 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith1 3 to 1>> NetworkUtils: Connection broken: 3 <- 1
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-26T03:18:58.050152571Z
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-26T03:18:58.050152571Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:180)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
    ... 7 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
    at java.base/java.io.DataInputStream.readBoolean(DataInputStream.java:255)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readBoolean(AugmentedDataInputStream.java:137)
    at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readMyTipsTheyHave$7(SyncUtils.java:163)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
    ... 11 more
node4 7m 55.795s 2025-09-26 03:18:58.054 2564 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith1 4 to 1>> NetworkUtils: Connection broken: 4 <- 1
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-26T03:18:58.052580320Z
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-26T03:18:58.052580320Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.sendAndReceiveEvents(ShadowgraphSynchronizer.java:241)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:201)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
    ... 7 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
    at java.base/java.io.DataInputStream.readByte(DataInputStream.java:275)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readByte(AugmentedDataInputStream.java:144)
    at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readEventsINeed$9(SyncUtils.java:278)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
    ... 12 more
node0 7m 55.936s 2025-09-26 03:18:58.195 8461 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith2 0 to 2>> NetworkUtils: Connection broken: 0 -> 2
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-26T03:18:58.195591289Z
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-26T03:18:58.195591289Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.sendAndReceiveEvents(ShadowgraphSynchronizer.java:241)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:201)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
    ... 7 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
    at java.base/java.io.DataInputStream.readByte(DataInputStream.java:275)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readByte(AugmentedDataInputStream.java:144)
    at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readEventsINeed$9(SyncUtils.java:278)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
    ... 12 more
node3 7m 55.936s 2025-09-26 03:18:58.195 8578 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith2 3 to 2>> NetworkUtils: Connection broken: 3 <- 2
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-26T03:18:58.195578396Z
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-26T03:18:58.195578396Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.sendAndReceiveEvents(ShadowgraphSynchronizer.java:241)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:201)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
    ... 7 more
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 12 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$sendEventsTheyNeed$8(SyncUtils.java:234)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
    at java.base/java.io.DataInputStream.readByte(DataInputStream.java:275)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readByte(AugmentedDataInputStream.java:144)
    at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readEventsINeed$9(SyncUtils.java:278)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
    ... 12 more
node4 7m 55.936s 2025-09-26 03:18:58.195 2573 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith2 4 to 2>> NetworkUtils: Connection broken: 4 <- 2
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-26T03:18:58.195439843Z
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-26T03:18:58.195439843Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.sendAndReceiveEvents(ShadowgraphSynchronizer.java:241)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:201)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
    ... 7 more
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 12 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$sendEventsTheyNeed$8(SyncUtils.java:234)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
    at java.base/java.io.DataInputStream.readByte(DataInputStream.java:275)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readByte(AugmentedDataInputStream.java:144)
    at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readEventsINeed$9(SyncUtils.java:278)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
    ... 12 more
node0 7m 56.258s 2025-09-26 03:18:58.517 8470 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith3 0 to 3>> NetworkUtils: Connection broken: 0 -> 3
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-26T03:18:58.516903063Z
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-26T03:18:58.516903063Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.sendAndReceiveEvents(ShadowgraphSynchronizer.java:241)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:201)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
    ... 7 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
    at java.base/java.io.DataInputStream.readByte(DataInputStream.java:275)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readByte(AugmentedDataInputStream.java:144)
    at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readEventsINeed$9(SyncUtils.java:278)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
    ... 12 more
node4 7m 56.258s 2025-09-26 03:18:58.517 2574 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith3 4 to 3>> NetworkUtils: Connection broken: 4 <- 3
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-26T03:18:58.516846546Z
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-26T03:18:58.516846546Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:148)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
    ... 7 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readLong(DataInputStream.java:407)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readLong(AugmentedDataInputStream.java:186)
    at com.swirlds.platform.gossip.shadowgraph.SyncUtils.deserializeEventWindow(SyncUtils.java:640)
    at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readTheirTipsAndEventWindow$3(SyncUtils.java:104)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
    ... 11 more