Node ID   Elapsed   Timestamp   Seq   Log Level   Log Marker   Thread   Class   Message
node1 0.000ns 2025-09-24 20:25:00.145 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node1 82.000ms 2025-09-24 20:25:00.227 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node1 97.000ms 2025-09-24 20:25:00.242 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
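The PlatformConfigUtils warning above calls for a one-for-one key rename in whatever configuration source sets this property. The exact file and format depend on the deployment, so the two lines below are only an illustrative sketch with a placeholder value; the same warning appears once per node below.

    reconnect.asyncOutputStreamFlushMilliseconds, <value>    (deprecated key)
    reconnect.asyncOutputStreamFlush, <value>                (current key)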
node0 109.000ms 2025-09-24 20:25:00.254 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node0 194.000ms 2025-09-24 20:25:00.339 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node1 204.000ms 2025-09-24 20:25:00.349 4 INFO STARTUP <main> Browser: The following nodes [1] are set to run locally
node0 209.000ms 2025-09-24 20:25:00.354 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node1 210.000ms 2025-09-24 20:25:00.355 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node1 222.000ms 2025-09-24 20:25:00.367 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node0 323.000ms 2025-09-24 20:25:00.468 4 INFO STARTUP <main> Browser: The following nodes [0] are set to run locally
node0 330.000ms 2025-09-24 20:25:00.475 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node0 343.000ms 2025-09-24 20:25:00.488 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node1 628.000ms 2025-09-24 20:25:00.773 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node1 628.000ms 2025-09-24 20:25:00.773 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node0 772.000ms 2025-09-24 20:25:00.917 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node0 773.000ms 2025-09-24 20:25:00.918 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node3 892.000ms 2025-09-24 20:25:01.037 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node4 909.000ms 2025-09-24 20:25:01.054 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node3 977.000ms 2025-09-24 20:25:01.122 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node3 992.000ms 2025-09-24 20:25:01.137 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 996.000ms 2025-09-24 20:25:01.141 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node4 1.011s 2025-09-24 20:25:01.156 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node3 1.100s 2025-09-24 20:25:01.245 4 INFO STARTUP <main> Browser: The following nodes [3] are set to run locally
node3 1.106s 2025-09-24 20:25:01.251 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node3 1.118s 2025-09-24 20:25:01.263 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 1.122s 2025-09-24 20:25:01.267 4 INFO STARTUP <main> Browser: The following nodes [4] are set to run locally
node4 1.129s 2025-09-24 20:25:01.274 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node4 1.141s 2025-09-24 20:25:01.286 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node1 1.481s 2025-09-24 20:25:01.626 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 852ms
node1 1.489s 2025-09-24 20:25:01.634 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node1 1.492s 2025-09-24 20:25:01.637 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node3 1.528s 2025-09-24 20:25:01.673 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node3 1.528s 2025-09-24 20:25:01.673 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node1 1.531s 2025-09-24 20:25:01.676 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node4 1.558s 2025-09-24 20:25:01.703 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node4 1.559s 2025-09-24 20:25:01.704 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node0 1.576s 2025-09-24 20:25:01.721 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 803ms
node0 1.584s 2025-09-24 20:25:01.729 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node0 1.587s 2025-09-24 20:25:01.732 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node1 1.608s 2025-09-24 20:25:01.753 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node1 1.609s 2025-09-24 20:25:01.754 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node0 1.625s 2025-09-24 20:25:01.770 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node0 1.694s 2025-09-24 20:25:01.839 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node0 1.695s 2025-09-24 20:25:01.840 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node2 2.125s 2025-09-24 20:25:02.270 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node2 2.217s 2025-09-24 20:25:02.362 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node2 2.233s 2025-09-24 20:25:02.378 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node2 2.349s 2025-09-24 20:25:02.494 4 INFO STARTUP <main> Browser: The following nodes [2] are set to run locally
node2 2.356s 2025-09-24 20:25:02.501 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node2 2.368s 2025-09-24 20:25:02.513 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node3 2.396s 2025-09-24 20:25:02.541 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 867ms
node3 2.404s 2025-09-24 20:25:02.549 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node3 2.407s 2025-09-24 20:25:02.552 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node3 2.443s 2025-09-24 20:25:02.588 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node3 2.502s 2025-09-24 20:25:02.647 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node3 2.502s 2025-09-24 20:25:02.647 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node4 2.505s 2025-09-24 20:25:02.650 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 946ms
node4 2.513s 2025-09-24 20:25:02.658 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node4 2.516s 2025-09-24 20:25:02.661 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 2.553s 2025-09-24 20:25:02.698 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node4 2.610s 2025-09-24 20:25:02.755 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node4 2.611s 2025-09-24 20:25:02.756 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node2 2.821s 2025-09-24 20:25:02.966 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node2 2.822s 2025-09-24 20:25:02.967 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node1 3.638s 2025-09-24 20:25:03.783 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node0 3.684s 2025-09-24 20:25:03.829 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node1 3.718s 2025-09-24 20:25:03.863 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 3.720s 2025-09-24 20:25:03.865 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node1 3.720s 2025-09-24 20:25:03.865 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node0 3.760s 2025-09-24 20:25:03.905 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 3.762s 2025-09-24 20:25:03.907 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node0 3.763s 2025-09-24 20:25:03.908 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node2 3.800s 2025-09-24 20:25:03.945 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 977ms
node2 3.810s 2025-09-24 20:25:03.955 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node2 3.814s 2025-09-24 20:25:03.959 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node2 3.865s 2025-09-24 20:25:04.010 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node2 3.930s 2025-09-24 20:25:04.075 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node2 3.931s 2025-09-24 20:25:04.076 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node1 4.461s 2025-09-24 20:25:04.606 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 4.464s 2025-09-24 20:25:04.609 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node1 4.468s 2025-09-24 20:25:04.613 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node1 4.477s 2025-09-24 20:25:04.622 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 4.479s 2025-09-24 20:25:04.624 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 4.503s 2025-09-24 20:25:04.648 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 4.506s 2025-09-24 20:25:04.651 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node0 4.512s 2025-09-24 20:25:04.657 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node0 4.523s 2025-09-24 20:25:04.668 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 4.524s 2025-09-24 20:25:04.669 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 4.538s 2025-09-24 20:25:04.683 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node3 4.618s 2025-09-24 20:25:04.763 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 4.620s 2025-09-24 20:25:04.765 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node3 4.621s 2025-09-24 20:25:04.766 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 4.630s 2025-09-24 20:25:04.775 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node4 4.713s 2025-09-24 20:25:04.858 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 4.715s 2025-09-24 20:25:04.860 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node4 4.716s 2025-09-24 20:25:04.861 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node3 5.393s 2025-09-24 20:25:05.538 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 5.395s 2025-09-24 20:25:05.540 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node3 5.401s 2025-09-24 20:25:05.546 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node3 5.413s 2025-09-24 20:25:05.558 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 5.415s 2025-09-24 20:25:05.560 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5.486s 2025-09-24 20:25:05.631 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5.489s 2025-09-24 20:25:05.634 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node4 5.495s 2025-09-24 20:25:05.640 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node4 5.504s 2025-09-24 20:25:05.649 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5.506s 2025-09-24 20:25:05.651 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 5.582s 2025-09-24 20:25:05.727 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26704426] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=178440, randomLong=-5293696194344587924, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=9080, randomLong=5281659517305610814, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=1337341, data=35, exception=null] OS Health Check Report - Complete (took 1021 ms)
node1 5.612s 2025-09-24 20:25:05.757 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node1 5.619s 2025-09-24 20:25:05.764 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node1 5.624s 2025-09-24 20:25:05.769 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node0 5.626s 2025-09-24 20:25:05.771 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26598751] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=254190, randomLong=9002038940534725311, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=10380, randomLong=-8106905576027108485, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=1307981, data=35, exception=null] OS Health Check Report - Complete (took 1022 ms)
node0 5.657s 2025-09-24 20:25:05.802 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node0 5.665s 2025-09-24 20:25:05.810 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node0 5.671s 2025-09-24 20:25:05.816 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node1 5.697s 2025-09-24 20:25:05.842 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "Ijy81A==", "port": 30124 }, { "ipAddressV4": "CoAAIg==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "IhwXsQ==", "port": 30125 }, { "ipAddressV4": "CoAAFA==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "Ih13kg==", "port": 30126 }, { "ipAddressV4": "CoAAKQ==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "Iq1CrQ==", "port": 30127 }, { "ipAddressV4": "CoAAGA==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "I7gDDA==", "port": 30128 }, { "ipAddressV4": "CoAAHA==", "port": 30128 }] }] }
node1 5.716s 2025-09-24 20:25:05.861 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/1/ConsistencyTestLog.csv
node1 5.717s 2025-09-24 20:25:05.862 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node1 5.730s 2025-09-24 20:25:05.875 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: 21539c8bd2cdad1d0401b23f119ef3194632d5847860e72748b11f1803453d861b51c8854691c7bb7029b6ac1501c94f (root) ConsistencyTestingToolState / clerk-huge-please-fragile 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate 1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline 2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
node0 5.749s 2025-09-24 20:25:05.894 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "Ijy81A==", "port": 30124 }, { "ipAddressV4": "CoAAIg==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "IhwXsQ==", "port": 30125 }, { "ipAddressV4": "CoAAFA==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "Ih13kg==", "port": 30126 }, { "ipAddressV4": "CoAAKQ==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "Iq1CrQ==", "port": 30127 }, { "ipAddressV4": "CoAAGA==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "I7gDDA==", "port": 30128 }, { "ipAddressV4": "CoAAHA==", "port": 30128 }] }] }
node0 5.770s 2025-09-24 20:25:05.915 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/0/ConsistencyTestLog.csv
node0 5.770s 2025-09-24 20:25:05.915 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node0 5.785s 2025-09-24 20:25:05.930 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: 21539c8bd2cdad1d0401b23f119ef3194632d5847860e72748b11f1803453d861b51c8854691c7bb7029b6ac1501c94f (root) ConsistencyTestingToolState / clerk-huge-please-fragile 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate 1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline 2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
node1 5.944s 2025-09-24 20:25:06.089 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
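The initial running hash logged above (38b060a7...b95b) is exactly the SHA-384 digest of empty input, consistent with the event stream's running hash starting from zero bytes before any events are folded in; that reading is an inference from the value, not something the log states. A small self-contained Java check:

    import java.security.MessageDigest;
    import java.util.HexFormat;

    public final class EmptyRunningHashCheck {
        public static void main(String[] args) throws Exception {
            // SHA-384 over zero bytes of input.
            byte[] digest = MessageDigest.getInstance("SHA-384").digest(new byte[0]);
            // Prints 38b060a751ac96384cd9327eb1b1e36a..., matching the logged initial running hash.
            System.out.println(HexFormat.of().formatHex(digest));
        }
    }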
node1 5.949s 2025-09-24 20:25:06.094 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node1 5.954s 2025-09-24 20:25:06.099 47 INFO STARTUP <<start-node-1>> ConsistencyTestingToolMain: init called in Main for node 1.
node1 5.954s 2025-09-24 20:25:06.099 48 INFO STARTUP <<start-node-1>> SwirldsPlatform: Starting platform 1
node1 5.955s 2025-09-24 20:25:06.100 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node1 5.958s 2025-09-24 20:25:06.103 50 INFO STARTUP <<start-node-1>> CycleFinder: No cyclical back pressure detected in wiring model.
node1 5.959s 2025-09-24 20:25:06.104 51 INFO STARTUP <<start-node-1>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node1 5.960s 2025-09-24 20:25:06.105 52 INFO STARTUP <<start-node-1>> InputWireChecks: All input wires have been bound.
node1 5.961s 2025-09-24 20:25:06.106 53 WARN STARTUP <<start-node-1>> PcesFileTracker: No preconsensus event files available
node1 5.961s 2025-09-24 20:25:06.106 54 INFO STARTUP <<start-node-1>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node1 5.963s 2025-09-24 20:25:06.108 55 INFO STARTUP <<start-node-1>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node1 5.963s 2025-09-24 20:25:06.108 56 INFO STARTUP <<app: appMain 1>> ConsistencyTestingToolMain: run called in Main.
node1 5.965s 2025-09-24 20:25:06.110 57 INFO PLATFORM_STATUS <platformForkJoinThread-4> DefaultStatusStateMachine: Platform spent 181.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node1 5.969s 2025-09-24 20:25:06.114 58 INFO PLATFORM_STATUS <platformForkJoinThread-4> DefaultStatusStateMachine: Platform spent 3.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node2 5.997s 2025-09-24 20:25:06.142 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node0 6.013s 2025-09-24 20:25:06.158 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node0 6.017s 2025-09-24 20:25:06.162 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node0 6.021s 2025-09-24 20:25:06.166 47 INFO STARTUP <<start-node-0>> ConsistencyTestingToolMain: init called in Main for node 0.
node0 6.022s 2025-09-24 20:25:06.167 48 INFO STARTUP <<start-node-0>> SwirldsPlatform: Starting platform 0
node0 6.023s 2025-09-24 20:25:06.168 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node0 6.026s 2025-09-24 20:25:06.171 50 INFO STARTUP <<start-node-0>> CycleFinder: No cyclical back pressure detected in wiring model.
node0 6.027s 2025-09-24 20:25:06.172 51 INFO STARTUP <<start-node-0>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node0 6.027s 2025-09-24 20:25:06.172 52 INFO STARTUP <<start-node-0>> InputWireChecks: All input wires have been bound.
node0 6.029s 2025-09-24 20:25:06.174 53 WARN STARTUP <<start-node-0>> PcesFileTracker: No preconsensus event files available
node0 6.029s 2025-09-24 20:25:06.174 54 INFO STARTUP <<start-node-0>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node0 6.030s 2025-09-24 20:25:06.175 55 INFO STARTUP <<start-node-0>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node0 6.032s 2025-09-24 20:25:06.177 56 INFO STARTUP <<app: appMain 0>> ConsistencyTestingToolMain: run called in Main.
node0 6.032s 2025-09-24 20:25:06.177 57 INFO PLATFORM_STATUS <platformForkJoinThread-1> DefaultStatusStateMachine: Platform spent 191.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node0 6.037s 2025-09-24 20:25:06.182 58 INFO PLATFORM_STATUS <platformForkJoinThread-1> DefaultStatusStateMachine: Platform spent 3.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node2 6.090s 2025-09-24 20:25:06.235 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 6.092s 2025-09-24 20:25:06.237 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node2 6.093s 2025-09-24 20:25:06.238 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node3 6.524s 2025-09-24 20:25:06.669 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26273161] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=146900, randomLong=557527347953404453, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=11530, randomLong=-8925152715390996769, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=1055960, data=35, exception=null] OS Health Check Report - Complete (took 1020 ms)
node3 6.554s 2025-09-24 20:25:06.699 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node3 6.561s 2025-09-24 20:25:06.706 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node3 6.566s 2025-09-24 20:25:06.711 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node4 6.613s 2025-09-24 20:25:06.758 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26194555] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=209430, randomLong=2799334471986580895, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=10209, randomLong=1370839255565847635, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=1295981, data=35, exception=null] OS Health Check Report - Complete (took 1023 ms)
node3 6.642s 2025-09-24 20:25:06.787 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "Ijy81A==", "port": 30124 }, { "ipAddressV4": "CoAAIg==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "IhwXsQ==", "port": 30125 }, { "ipAddressV4": "CoAAFA==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "Ih13kg==", "port": 30126 }, { "ipAddressV4": "CoAAKQ==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "Iq1CrQ==", "port": 30127 }, { "ipAddressV4": "CoAAGA==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "I7gDDA==", "port": 30128 }, { "ipAddressV4": "CoAAHA==", "port": 30128 }] }] }
node4 6.644s 2025-09-24 20:25:06.789 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node4 6.652s 2025-09-24 20:25:06.797 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node4 6.657s 2025-09-24 20:25:06.802 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node3 6.662s 2025-09-24 20:25:06.807 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/3/ConsistencyTestLog.csv
node3 6.662s 2025-09-24 20:25:06.807 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node3 6.676s 2025-09-24 20:25:06.821 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: 21539c8bd2cdad1d0401b23f119ef3194632d5847860e72748b11f1803453d861b51c8854691c7bb7029b6ac1501c94f (root) ConsistencyTestingToolState / clerk-huge-please-fragile 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate 1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline 2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
node4 6.737s 2025-09-24 20:25:06.882 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "Ijy81A==", "port": 30124 }, { "ipAddressV4": "CoAAIg==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "IhwXsQ==", "port": 30125 }, { "ipAddressV4": "CoAAFA==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "Ih13kg==", "port": 30126 }, { "ipAddressV4": "CoAAKQ==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "Iq1CrQ==", "port": 30127 }, { "ipAddressV4": "CoAAGA==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "I7gDDA==", "port": 30128 }, { "ipAddressV4": "CoAAHA==", "port": 30128 }] }] }
node4 6.757s 2025-09-24 20:25:06.902 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 6.758s 2025-09-24 20:25:06.903 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node4 6.772s 2025-09-24 20:25:06.917 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: 21539c8bd2cdad1d0401b23f119ef3194632d5847860e72748b11f1803453d861b51c8854691c7bb7029b6ac1501c94f (root) ConsistencyTestingToolState / clerk-huge-please-fragile 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate 1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline 2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
node3 6.881s 2025-09-24 20:25:07.026 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node3 6.885s 2025-09-24 20:25:07.030 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node2 6.886s 2025-09-24 20:25:07.031 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 6.889s 2025-09-24 20:25:07.034 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node3 6.890s 2025-09-24 20:25:07.035 47 INFO STARTUP <<start-node-3>> ConsistencyTestingToolMain: init called in Main for node 3.
node3 6.890s 2025-09-24 20:25:07.035 48 INFO STARTUP <<start-node-3>> SwirldsPlatform: Starting platform 3
node3 6.891s 2025-09-24 20:25:07.036 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node3 6.895s 2025-09-24 20:25:07.040 50 INFO STARTUP <<start-node-3>> CycleFinder: No cyclical back pressure detected in wiring model.
node3 6.896s 2025-09-24 20:25:07.041 51 INFO STARTUP <<start-node-3>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node2 6.897s 2025-09-24 20:25:07.042 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node3 6.897s 2025-09-24 20:25:07.042 52 INFO STARTUP <<start-node-3>> InputWireChecks: All input wires have been bound.
node3 6.898s 2025-09-24 20:25:07.043 53 WARN STARTUP <<start-node-3>> PcesFileTracker: No preconsensus event files available
node3 6.898s 2025-09-24 20:25:07.043 54 INFO STARTUP <<start-node-3>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node3 6.900s 2025-09-24 20:25:07.045 55 INFO STARTUP <<start-node-3>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node3 6.901s 2025-09-24 20:25:07.046 56 INFO STARTUP <<app: appMain 3>> ConsistencyTestingToolMain: run called in Main.
node3 6.902s 2025-09-24 20:25:07.047 57 INFO PLATFORM_STATUS <platformForkJoinThread-3> DefaultStatusStateMachine: Platform spent 172.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node3 6.908s 2025-09-24 20:25:07.053 58 INFO PLATFORM_STATUS <platformForkJoinThread-3> DefaultStatusStateMachine: Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node2 6.911s 2025-09-24 20:25:07.056 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 6.913s 2025-09-24 20:25:07.058 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 6.949s 2025-09-24 20:25:07.094 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node4 6.954s 2025-09-24 20:25:07.099 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node4 6.958s 2025-09-24 20:25:07.103 47 INFO STARTUP <<start-node-4>> ConsistencyTestingToolMain: init called in Main for node 4.
node4 6.958s 2025-09-24 20:25:07.103 48 INFO STARTUP <<start-node-4>> SwirldsPlatform: Starting platform 4
node4 6.959s 2025-09-24 20:25:07.104 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node4 6.962s 2025-09-24 20:25:07.107 50 INFO STARTUP <<start-node-4>> CycleFinder: No cyclical back pressure detected in wiring model.
node4 6.963s 2025-09-24 20:25:07.108 51 INFO STARTUP <<start-node-4>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node4 6.964s 2025-09-24 20:25:07.109 52 INFO STARTUP <<start-node-4>> InputWireChecks: All input wires have been bound.
node4 6.965s 2025-09-24 20:25:07.110 53 WARN STARTUP <<start-node-4>> PcesFileTracker: No preconsensus event files available
node4 6.966s 2025-09-24 20:25:07.111 54 INFO STARTUP <<start-node-4>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node4 6.967s 2025-09-24 20:25:07.112 55 INFO STARTUP <<start-node-4>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node4 6.968s 2025-09-24 20:25:07.113 56 INFO STARTUP <<app: appMain 4>> ConsistencyTestingToolMain: run called in Main.
node4 6.969s 2025-09-24 20:25:07.114 57 INFO PLATFORM_STATUS <platformForkJoinThread-2> DefaultStatusStateMachine: Platform spent 141.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node4 6.974s 2025-09-24 20:25:07.119 58 INFO PLATFORM_STATUS <platformForkJoinThread-2> DefaultStatusStateMachine: Platform spent 3.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node2 8.035s 2025-09-24 20:25:08.180 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26293789] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=213360, randomLong=-356917788469465257, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=14190, randomLong=-5911566235576758630, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=1306909, data=35, exception=null] OS Health Check Report - Complete (took 1021 ms)
node2 8.066s 2025-09-24 20:25:08.211 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node2 8.074s 2025-09-24 20:25:08.219 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node2 8.079s 2025-09-24 20:25:08.224 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node2 8.158s 2025-09-24 20:25:08.303 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "Ijy81A==", "port": 30124 }, { "ipAddressV4": "CoAAIg==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "IhwXsQ==", "port": 30125 }, { "ipAddressV4": "CoAAFA==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "Ih13kg==", "port": 30126 }, { "ipAddressV4": "CoAAKQ==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "Iq1CrQ==", "port": 30127 }, { "ipAddressV4": "CoAAGA==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "I7gDDA==", "port": 30128 }, { "ipAddressV4": "CoAAHA==", "port": 30128 }] }] }
node2 8.180s 2025-09-24 20:25:08.325 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/2/ConsistencyTestLog.csv
node2 8.181s 2025-09-24 20:25:08.326 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node2 8.197s 2025-09-24 20:25:08.342 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: 21539c8bd2cdad1d0401b23f119ef3194632d5847860e72748b11f1803453d861b51c8854691c7bb7029b6ac1501c94f (root) ConsistencyTestingToolState / clerk-huge-please-fragile 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate 1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline 2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
node2 8.405s 2025-09-24 20:25:08.550 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node2 8.409s 2025-09-24 20:25:08.554 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node2 8.414s 2025-09-24 20:25:08.559 47 INFO STARTUP <<start-node-2>> ConsistencyTestingToolMain: init called in Main for node 2.
node2 8.415s 2025-09-24 20:25:08.560 48 INFO STARTUP <<start-node-2>> SwirldsPlatform: Starting platform 2
node2 8.416s 2025-09-24 20:25:08.561 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node2 8.420s 2025-09-24 20:25:08.565 50 INFO STARTUP <<start-node-2>> CycleFinder: No cyclical back pressure detected in wiring model.
node2 8.421s 2025-09-24 20:25:08.566 51 INFO STARTUP <<start-node-2>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node2 8.421s 2025-09-24 20:25:08.566 52 INFO STARTUP <<start-node-2>> InputWireChecks: All input wires have been bound.
node2 8.423s 2025-09-24 20:25:08.568 53 WARN STARTUP <<start-node-2>> PcesFileTracker: No preconsensus event files available
node2 8.423s 2025-09-24 20:25:08.568 54 INFO STARTUP <<start-node-2>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node2 8.425s 2025-09-24 20:25:08.570 55 INFO STARTUP <<start-node-2>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node2 8.426s 2025-09-24 20:25:08.571 56 INFO STARTUP <<app: appMain 2>> ConsistencyTestingToolMain: run called in Main.
node2 8.427s 2025-09-24 20:25:08.572 57 INFO PLATFORM_STATUS <platformForkJoinThread-3> DefaultStatusStateMachine: Platform spent 173.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node2 8.432s 2025-09-24 20:25:08.577 58 INFO PLATFORM_STATUS <platformForkJoinThread-3> DefaultStatusStateMachine: Platform spent 3.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node1 8.964s 2025-09-24 20:25:09.109 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting1.csv' ]
node1 8.967s 2025-09-24 20:25:09.112 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node0 9.032s 2025-09-24 20:25:09.177 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting0.csv' ]
node0 9.034s 2025-09-24 20:25:09.179 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node3 9.904s 2025-09-24 20:25:10.049 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting3.csv' ]
node3 9.907s 2025-09-24 20:25:10.052 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node4 9.969s 2025-09-24 20:25:10.114 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting4.csv' ]
node4 9.971s 2025-09-24 20:25:10.116 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node2 11.425s 2025-09-24 20:25:11.570 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting2.csv' ]
node2 11.427s 2025-09-24 20:25:11.572 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node1 16.061s 2025-09-24 20:25:16.206 61 INFO PLATFORM_STATUS <platformForkJoinThread-7> DefaultStatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node0 16.129s 2025-09-24 20:25:16.274 61 INFO PLATFORM_STATUS <platformForkJoinThread-2> DefaultStatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node3 16.997s 2025-09-24 20:25:17.142 61 INFO PLATFORM_STATUS <platformForkJoinThread-6> DefaultStatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node4 17.066s 2025-09-24 20:25:17.211 61 INFO PLATFORM_STATUS <platformForkJoinThread-2> DefaultStatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node1 18.434s 2025-09-24 20:25:18.579 62 INFO STARTUP <<scheduler TransactionHandler>> DefaultTransactionHandler: Ignoring empty consensus round 1
node3 18.481s 2025-09-24 20:25:18.626 62 INFO STARTUP <<scheduler TransactionHandler>> DefaultTransactionHandler: Ignoring empty consensus round 1
node2 18.523s 2025-09-24 20:25:18.668 61 INFO PLATFORM_STATUS <platformForkJoinThread-7> DefaultStatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node0 18.561s 2025-09-24 20:25:18.706 62 INFO STARTUP <<scheduler TransactionHandler>> DefaultTransactionHandler: Ignoring empty consensus round 1
node2 18.616s 2025-09-24 20:25:18.761 62 INFO STARTUP <<scheduler TransactionHandler>> DefaultTransactionHandler: Ignoring empty consensus round 1
node4 18.617s 2025-09-24 20:25:18.762 62 INFO STARTUP <<scheduler TransactionHandler>> DefaultTransactionHandler: Ignoring empty consensus round 1
node3 18.899s 2025-09-24 20:25:19.044 63 INFO PLATFORM_STATUS <platformForkJoinThread-7> DefaultStatusStateMachine: Platform spent 1.9 s in CHECKING. Now in ACTIVE
node3 18.902s 2025-09-24 20:25:19.047 65 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 2 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node0 18.984s 2025-09-24 20:25:19.129 63 INFO PLATFORM_STATUS <platformForkJoinThread-8> DefaultStatusStateMachine: Platform spent 2.9 s in CHECKING. Now in ACTIVE
node0 18.987s 2025-09-24 20:25:19.132 65 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 2 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node2 19.039s 2025-09-24 20:25:19.184 64 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 2 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node4 19.115s 2025-09-24 20:25:19.260 63 INFO PLATFORM_STATUS <platformForkJoinThread-6> DefaultStatusStateMachine: Platform spent 2.0 s in CHECKING. Now in ACTIVE
node4 19.118s 2025-09-24 20:25:19.263 65 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 2 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node1 19.131s 2025-09-24 20:25:19.276 63 INFO PLATFORM_STATUS <platformForkJoinThread-6> DefaultStatusStateMachine: Platform spent 3.1 s in CHECKING. Now in ACTIVE
node1 19.134s 2025-09-24 20:25:19.279 65 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 2 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node1 19.383s 2025-09-24 20:25:19.528 80 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 2 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/2
node1 19.384s 2025-09-24 20:25:19.529 81 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node4 19.401s 2025-09-24 20:25:19.546 80 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 2 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/2
node4 19.403s 2025-09-24 20:25:19.548 81 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node0 19.461s 2025-09-24 20:25:19.606 80 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 2 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/2
node0 19.463s 2025-09-24 20:25:19.608 81 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node3 19.476s 2025-09-24 20:25:19.621 80 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 2 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/2
node3 19.477s 2025-09-24 20:25:19.622 81 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node2 19.574s 2025-09-24 20:25:19.719 79 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 2 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/2
node2 19.576s 2025-09-24 20:25:19.721 80 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node4 19.640s 2025-09-24 20:25:19.785 111 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node4 19.643s 2025-09-24 20:25:19.788 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 2 Timestamp: 2025-09-24T20:25:17.779607205Z Next consensus number: 10 Legacy running event hash: 460a7f734e307d9bc7fac20f756ea40059aff2e3abc73f807f6010c4a8988f839e991c778a3beb720aec77b18ae32f62 Legacy running event mnemonic: scout-unveil-ship-recall Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -799502542 Root hash: a0e85970f3953ca7773a50049cd668d25bf3ad73169d362b63ac2b99163887a8b31b58141edae09ed5e7a1b1bbba1142 (root) ConsistencyTestingToolState / toast-come-miracle-neutral 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 spring-salmon-old-pole 1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline 2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot 3 StringLeaf 8898299034380133366 /3 nephew-deny-real-blanket 4 StringLeaf 1 /4 wreck-whale-old-bottom
node4 19.678s 2025-09-24 20:25:19.823 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/09/24/2025-09-24T20+25+16.448491026Z_seq0_minr1_maxr501_orgn0.pces
node4 19.679s 2025-09-24 20:25:19.824 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/4/2025/09/24/2025-09-24T20+25+16.448491026Z_seq0_minr1_maxr501_orgn0.pces
node4 19.679s 2025-09-24 20:25:19.824 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 19.680s 2025-09-24 20:25:19.825 116 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 19.685s 2025-09-24 20:25:19.830 117 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 2 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/2 {"round":2,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/2/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
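The preconsensus event files copied above use a self-describing name such as 2025-09-24T20+25+16.448491026Z_seq0_minr1_maxr501_orgn0.pces. Below is a small sketch that splits such a name into its parts; reading seq/minr/maxr/orgn as sequence number, minimum round, maximum round, and origin is an assumption based on the tokens themselves and on the replay messages above, not something this log spells out.

# Sketch: split a PCES file name like the ones logged above into its parts.
# Assumed convention: <timestamp>_seq<N>_minr<N>_maxr<N>_orgn<N>.pces, where
# ':' in the timestamp has been replaced by '+' to keep the name filesystem-safe.
import re

PCES_NAME = re.compile(
    r"(?P<timestamp>.+Z)_seq(?P<sequence>\d+)_minr(?P<min_round>\d+)"
    r"_maxr(?P<max_round>\d+)_orgn(?P<origin>\d+)\.pces$"
)

def parse_pces_name(name: str) -> dict:
    match = PCES_NAME.search(name)
    if match is None:
        raise ValueError(f"not a recognized PCES file name: {name}")
    fields = match.groupdict()
    fields["timestamp"] = fields["timestamp"].replace("+", ":")
    return fields

print(parse_pces_name(
    "2025-09-24T20+25+16.448491026Z_seq0_minr1_maxr501_orgn0.pces"))
# -> timestamp 2025-09-24T20:25:16.448491026Z, sequence 0, rounds 1..501, origin 0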
node1 19.700s 2025-09-24 20:25:19.845 111 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node1 19.703s 2025-09-24 20:25:19.848 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 2 Timestamp: 2025-09-24T20:25:17.779607205Z Next consensus number: 10 Legacy running event hash: 460a7f734e307d9bc7fac20f756ea40059aff2e3abc73f807f6010c4a8988f839e991c778a3beb720aec77b18ae32f62 Legacy running event mnemonic: scout-unveil-ship-recall Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -799502542 Root hash: a0e85970f3953ca7773a50049cd668d25bf3ad73169d362b63ac2b99163887a8b31b58141edae09ed5e7a1b1bbba1142 (root) ConsistencyTestingToolState / toast-come-miracle-neutral 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 spring-salmon-old-pole 1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline 2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot 3 StringLeaf 8898299034380133366 /3 nephew-deny-real-blanket 4 StringLeaf 1 /4 wreck-whale-old-bottom
node0 19.712s 2025-09-24 20:25:19.857 111 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node0 19.715s 2025-09-24 20:25:19.860 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 2 Timestamp: 2025-09-24T20:25:17.779607205Z Next consensus number: 10 Legacy running event hash: 460a7f734e307d9bc7fac20f756ea40059aff2e3abc73f807f6010c4a8988f839e991c778a3beb720aec77b18ae32f62 Legacy running event mnemonic: scout-unveil-ship-recall Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -799502542 Root hash: a0e85970f3953ca7773a50049cd668d25bf3ad73169d362b63ac2b99163887a8b31b58141edae09ed5e7a1b1bbba1142 (root) ConsistencyTestingToolState / toast-come-miracle-neutral 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 spring-salmon-old-pole 1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline 2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot 3 StringLeaf 8898299034380133366 /3 nephew-deny-real-blanket 4 StringLeaf 1 /4 wreck-whale-old-bottom
node3 19.724s 2025-09-24 20:25:19.869 111 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node3 19.727s 2025-09-24 20:25:19.872 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 2 Timestamp: 2025-09-24T20:25:17.779607205Z Next consensus number: 10 Legacy running event hash: 460a7f734e307d9bc7fac20f756ea40059aff2e3abc73f807f6010c4a8988f839e991c778a3beb720aec77b18ae32f62 Legacy running event mnemonic: scout-unveil-ship-recall Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -799502542 Root hash: a0e85970f3953ca7773a50049cd668d25bf3ad73169d362b63ac2b99163887a8b31b58141edae09ed5e7a1b1bbba1142 (root) ConsistencyTestingToolState / toast-come-miracle-neutral 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 spring-salmon-old-pole 1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline 2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot 3 StringLeaf 8898299034380133366 /3 nephew-deny-real-blanket 4 StringLeaf 1 /4 wreck-whale-old-bottom
node1 19.737s 2025-09-24 20:25:19.882 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/24/2025-09-24T20+25+16.237543055Z_seq0_minr1_maxr501_orgn0.pces
node1 19.737s 2025-09-24 20:25:19.882 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/1/2025/09/24/2025-09-24T20+25+16.237543055Z_seq0_minr1_maxr501_orgn0.pces
node1 19.737s 2025-09-24 20:25:19.882 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 19.738s 2025-09-24 20:25:19.883 116 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 19.743s 2025-09-24 20:25:19.888 117 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 2 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/2 {"round":2,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/2/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 19.746s 2025-09-24 20:25:19.891 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/24/2025-09-24T20+25+16.311647372Z_seq0_minr1_maxr501_orgn0.pces
node0 19.747s 2025-09-24 20:25:19.892 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/0/2025/09/24/2025-09-24T20+25+16.311647372Z_seq0_minr1_maxr501_orgn0.pces
node0 19.747s 2025-09-24 20:25:19.892 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 19.748s 2025-09-24 20:25:19.893 116 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 19.753s 2025-09-24 20:25:19.898 117 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 2 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/2 {"round":2,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/2/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 19.760s 2025-09-24 20:25:19.905 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/24/2025-09-24T20+25+16.374322506Z_seq0_minr1_maxr501_orgn0.pces
node3 19.760s 2025-09-24 20:25:19.905 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/3/2025/09/24/2025-09-24T20+25+16.374322506Z_seq0_minr1_maxr501_orgn0.pces
node3 19.760s 2025-09-24 20:25:19.905 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 19.761s 2025-09-24 20:25:19.906 116 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 19.767s 2025-09-24 20:25:19.912 117 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 2 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/2 {"round":2,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/2/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 19.829s 2025-09-24 20:25:19.974 117 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node2 19.832s 2025-09-24 20:25:19.977 118 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 2 Timestamp: 2025-09-24T20:25:17.779607205Z Next consensus number: 10 Legacy running event hash: 460a7f734e307d9bc7fac20f756ea40059aff2e3abc73f807f6010c4a8988f839e991c778a3beb720aec77b18ae32f62 Legacy running event mnemonic: scout-unveil-ship-recall Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -799502542 Root hash: a0e85970f3953ca7773a50049cd668d25bf3ad73169d362b63ac2b99163887a8b31b58141edae09ed5e7a1b1bbba1142 (root) ConsistencyTestingToolState / toast-come-miracle-neutral 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 spring-salmon-old-pole 1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline 2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot 3 StringLeaf 8898299034380133366 /3 nephew-deny-real-blanket 4 StringLeaf 1 /4 wreck-whale-old-bottom
node2 19.867s 2025-09-24 20:25:20.012 119 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/24/2025-09-24T20+25+16.619264354Z_seq0_minr1_maxr501_orgn0.pces
node2 19.868s 2025-09-24 20:25:20.013 120 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/2/2025/09/24/2025-09-24T20+25+16.619264354Z_seq0_minr1_maxr501_orgn0.pces
node2 19.868s 2025-09-24 20:25:20.013 121 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 19.869s 2025-09-24 20:25:20.014 122 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 19.875s 2025-09-24 20:25:20.020 123 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 2 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/2 {"round":2,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/2/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 20.304s 2025-09-24 20:25:20.449 125 INFO PLATFORM_STATUS <platformForkJoinThread-4> DefaultStatusStateMachine: Platform spent 1.8 s in CHECKING. Now in ACTIVE
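With node2 reaching ACTIVE here, every node has walked the same status sequence recorded by the PLATFORM_STATUS lines: STARTING_UP, REPLAYING_EVENTS, OBSERVING, CHECKING, ACTIVE. A rough sketch for tabulating those transitions per node from lines in this capture's format is below; the layout it assumes (node id first, then "Platform spent <duration> in <OLD>. Now in <NEW>" at the end of the line) is simply what appears in this log.

# Sketch: collect per-node status transitions from PLATFORM_STATUS lines like
# "node3 ... DefaultStatusStateMachine: Platform spent 1.9 s in CHECKING. Now in ACTIVE"
import re
from collections import defaultdict

TRANSITION = re.compile(
    r"^(?P<node>\S+)\s.*Platform spent (?P<duration>[\d.]+ (?:ms|s)) "
    r"in (?P<old>\w+)\. Now in (?P<new>\w+)"
)

def collect_transitions(lines):
    history = defaultdict(list)
    for line in lines:
        match = TRANSITION.match(line)
        if match:
            history[match["node"]].append(
                (match["old"], match["new"], match["duration"]))
    return history

sample = [
    "node3 18.899s ... DefaultStatusStateMachine: Platform spent 1.9 s in CHECKING. Now in ACTIVE",
    "node2 20.304s ... DefaultStatusStateMachine: Platform spent 1.8 s in CHECKING. Now in ACTIVE",
]
for node, steps in collect_transitions(sample).items():
    print(node, steps)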
node3 1m 1.395s 2025-09-24 20:26:01.540 905 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 74 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 1m 1.407s 2025-09-24 20:26:01.552 893 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 74 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 1m 1.441s 2025-09-24 20:26:01.586 881 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 74 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 1m 1.450s 2025-09-24 20:26:01.595 903 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 74 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 1m 1.496s 2025-09-24 20:26:01.641 905 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 74 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 1m 1.775s 2025-09-24 20:26:01.920 908 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 74 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/74
node3 1m 1.775s 2025-09-24 20:26:01.920 909 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 74
node1 1m 1.816s 2025-09-24 20:26:01.961 896 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 74 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/74
node1 1m 1.817s 2025-09-24 20:26:01.962 897 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 74
node4 1m 1.838s 2025-09-24 20:26:01.983 884 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 74 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/74
node4 1m 1.838s 2025-09-24 20:26:01.983 885 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 74
node3 1m 1.858s 2025-09-24 20:26:02.003 940 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 74
node3 1m 1.862s 2025-09-24 20:26:02.007 941 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 74 Timestamp: 2025-09-24T20:26:00.135192915Z Next consensus number: 1800 Legacy running event hash: d0bd78d252fda930e14c7dd894b3972baa51d4ad26b5ec60de0b8a444cf830d9b840c74b8718a41c51b25199af282042 Legacy running event mnemonic: trick-bottom-garden-sure Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1722860217 Root hash: c18d362e54d87a28a4ec1ffebce82f27a9b80d51416e4165e7555b4379e8cee25927b24f2facb785bdf2a5dd57917ec9 (root) ConsistencyTestingToolState / solar-shift-only-eagle 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 basic-account-disease-leaf 1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline 2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot 3 StringLeaf 3791943449832949703 /3 explain-photo-coffee-copper 4 StringLeaf 73 /4 width-control-plate-shrimp
node3 1m 1.870s 2025-09-24 20:26:02.015 942 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/24/2025-09-24T20+25+16.374322506Z_seq0_minr1_maxr501_orgn0.pces
node3 1m 1.870s 2025-09-24 20:26:02.015 943 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 47 File: data/saved/preconsensus-events/3/2025/09/24/2025-09-24T20+25+16.374322506Z_seq0_minr1_maxr501_orgn0.pces
node3 1m 1.870s 2025-09-24 20:26:02.015 944 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 1m 1.872s 2025-09-24 20:26:02.017 945 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 1m 1.872s 2025-09-24 20:26:02.017 946 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 74 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/74 {"round":74,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/74/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 1m 1.873s 2025-09-24 20:26:02.018 906 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 74 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/74
node2 1m 1.874s 2025-09-24 20:26:02.019 907 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 74
node1 1m 1.907s 2025-09-24 20:26:02.052 928 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 74
node1 1m 1.910s 2025-09-24 20:26:02.055 929 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 74 Timestamp: 2025-09-24T20:26:00.135192915Z Next consensus number: 1800 Legacy running event hash: d0bd78d252fda930e14c7dd894b3972baa51d4ad26b5ec60de0b8a444cf830d9b840c74b8718a41c51b25199af282042 Legacy running event mnemonic: trick-bottom-garden-sure Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1722860217 Root hash: c18d362e54d87a28a4ec1ffebce82f27a9b80d51416e4165e7555b4379e8cee25927b24f2facb785bdf2a5dd57917ec9 (root) ConsistencyTestingToolState / solar-shift-only-eagle 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 basic-account-disease-leaf 1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline 2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot 3 StringLeaf 3791943449832949703 /3 explain-photo-coffee-copper 4 StringLeaf 73 /4 width-control-plate-shrimp
node4 1m 1.917s 2025-09-24 20:26:02.062 916 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 74
node1 1m 1.919s 2025-09-24 20:26:02.064 930 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/24/2025-09-24T20+25+16.237543055Z_seq0_minr1_maxr501_orgn0.pces
node1 1m 1.919s 2025-09-24 20:26:02.064 931 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 47 File: data/saved/preconsensus-events/1/2025/09/24/2025-09-24T20+25+16.237543055Z_seq0_minr1_maxr501_orgn0.pces
node1 1m 1.919s 2025-09-24 20:26:02.064 932 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 1m 1.920s 2025-09-24 20:26:02.065 917 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 74 Timestamp: 2025-09-24T20:26:00.135192915Z Next consensus number: 1800 Legacy running event hash: d0bd78d252fda930e14c7dd894b3972baa51d4ad26b5ec60de0b8a444cf830d9b840c74b8718a41c51b25199af282042 Legacy running event mnemonic: trick-bottom-garden-sure Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1722860217 Root hash: c18d362e54d87a28a4ec1ffebce82f27a9b80d51416e4165e7555b4379e8cee25927b24f2facb785bdf2a5dd57917ec9 (root) ConsistencyTestingToolState / solar-shift-only-eagle 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 basic-account-disease-leaf 1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline 2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot 3 StringLeaf 3791943449832949703 /3 explain-photo-coffee-copper 4 StringLeaf 73 /4 width-control-plate-shrimp
node1 1m 1.921s 2025-09-24 20:26:02.066 933 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 1m 1.922s 2025-09-24 20:26:02.067 934 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 74 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/74 {"round":74,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/74/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 1m 1.929s 2025-09-24 20:26:02.074 918 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/09/24/2025-09-24T20+25+16.448491026Z_seq0_minr1_maxr501_orgn0.pces
node4 1m 1.930s 2025-09-24 20:26:02.075 919 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 47 File: data/saved/preconsensus-events/4/2025/09/24/2025-09-24T20+25+16.448491026Z_seq0_minr1_maxr501_orgn0.pces
node4 1m 1.930s 2025-09-24 20:26:02.075 920 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 1m 1.931s 2025-09-24 20:26:02.076 921 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 1m 1.932s 2025-09-24 20:26:02.077 922 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 74 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/74 {"round":74,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/74/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 1m 1.943s 2025-09-24 20:26:02.088 908 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 74 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/74
node0 1m 1.944s 2025-09-24 20:26:02.089 909 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 74
node2 1m 1.956s 2025-09-24 20:26:02.101 938 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 74
node2 1m 1.959s 2025-09-24 20:26:02.104 939 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 74 Timestamp: 2025-09-24T20:26:00.135192915Z Next consensus number: 1800 Legacy running event hash: d0bd78d252fda930e14c7dd894b3972baa51d4ad26b5ec60de0b8a444cf830d9b840c74b8718a41c51b25199af282042 Legacy running event mnemonic: trick-bottom-garden-sure Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1722860217 Root hash: c18d362e54d87a28a4ec1ffebce82f27a9b80d51416e4165e7555b4379e8cee25927b24f2facb785bdf2a5dd57917ec9 (root) ConsistencyTestingToolState / solar-shift-only-eagle 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 basic-account-disease-leaf 1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline 2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot 3 StringLeaf 3791943449832949703 /3 explain-photo-coffee-copper 4 StringLeaf 73 /4 width-control-plate-shrimp
node2 1m 1.967s 2025-09-24 20:26:02.112 940 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/24/2025-09-24T20+25+16.619264354Z_seq0_minr1_maxr501_orgn0.pces
node2 1m 1.967s 2025-09-24 20:26:02.112 941 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 47 File: data/saved/preconsensus-events/2/2025/09/24/2025-09-24T20+25+16.619264354Z_seq0_minr1_maxr501_orgn0.pces
node2 1m 1.967s 2025-09-24 20:26:02.112 942 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 1m 1.969s 2025-09-24 20:26:02.114 943 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 1m 1.969s 2025-09-24 20:26:02.114 944 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 74 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/74 {"round":74,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/74/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 1m 2.025s 2025-09-24 20:26:02.170 940 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 74
node0 1m 2.027s 2025-09-24 20:26:02.172 941 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 74
Timestamp: 2025-09-24T20:26:00.135192915Z
Next consensus number: 1800
Legacy running event hash: d0bd78d252fda930e14c7dd894b3972baa51d4ad26b5ec60de0b8a444cf830d9b840c74b8718a41c51b25199af282042
Legacy running event mnemonic: trick-bottom-garden-sure
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1722860217
Root hash: c18d362e54d87a28a4ec1ffebce82f27a9b80d51416e4165e7555b4379e8cee25927b24f2facb785bdf2a5dd57917ec9
(root) ConsistencyTestingToolState / solar-shift-only-eagle
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 basic-account-disease-leaf
    1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline
    2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
    3 StringLeaf 3791943449832949703 /3 explain-photo-coffee-copper
    4 StringLeaf 73 /4 width-control-plate-shrimp
node0 1m 2.036s 2025-09-24 20:26:02.181 950 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/24/2025-09-24T20+25+16.311647372Z_seq0_minr1_maxr501_orgn0.pces
node0 1m 2.036s 2025-09-24 20:26:02.181 951 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 47 File: data/saved/preconsensus-events/0/2025/09/24/2025-09-24T20+25+16.311647372Z_seq0_minr1_maxr501_orgn0.pces
node0 1m 2.036s 2025-09-24 20:26:02.181 952 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 1m 2.038s 2025-09-24 20:26:02.183 953 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 1m 2.039s 2025-09-24 20:26:02.184 954 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 74 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/74 {"round":74,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/74/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 2m 1.605s 2025-09-24 20:27:01.750 1994 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 167 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 2m 1.659s 2025-09-24 20:27:01.804 2000 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 167 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 2m 1.692s 2025-09-24 20:27:01.837 1952 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 167 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 2m 1.714s 2025-09-24 20:27:01.859 1990 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 167 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 2m 1.794s 2025-09-24 20:27:01.939 2000 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 167 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 2m 2.039s 2025-09-24 20:27:02.184 2003 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 167 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/167
node0 2m 2.039s 2025-09-24 20:27:02.184 2004 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 167
node2 2m 2.085s 2025-09-24 20:27:02.230 2003 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 167 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/167
node2 2m 2.086s 2025-09-24 20:27:02.231 2004 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 167
node0 2m 2.131s 2025-09-24 20:27:02.276 2035 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 167
node0 2m 2.133s 2025-09-24 20:27:02.278 2036 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 167
Timestamp: 2025-09-24T20:27:00.309380Z
Next consensus number: 4329
Legacy running event hash: 51c9927c212503a76ce269a60a01593c92809d02abbc8c5767d1df3442a32f11250b2b0754628329345a1a47183ce0d6
Legacy running event mnemonic: decrease-rack-caution-insane
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1367769167
Root hash: 2d8e78b6d41ccdd85c942318e09e47bd0bb2ea7609ae842f3fcaad4f01c91e2a36874a1159dca539cb302671cfc522a9
(root) ConsistencyTestingToolState / range-bitter-lady-eagle
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 vast-front-future-prefer
    1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline
    2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
    3 StringLeaf -4986745221054146073 /3 strong-prosper-riot-ozone
    4 StringLeaf 166 /4 wolf-domain-child-slam
node0 2m 2.140s 2025-09-24 20:27:02.285 2037 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/24/2025-09-24T20+25+16.311647372Z_seq0_minr1_maxr501_orgn0.pces
node0 2m 2.141s 2025-09-24 20:27:02.286 2038 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 139 File: data/saved/preconsensus-events/0/2025/09/24/2025-09-24T20+25+16.311647372Z_seq0_minr1_maxr501_orgn0.pces
node0 2m 2.141s 2025-09-24 20:27:02.286 2039 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 2m 2.144s 2025-09-24 20:27:02.289 2040 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 2m 2.144s 2025-09-24 20:27:02.289 2041 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 167 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/167 {"round":167,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/167/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 2m 2.178s 2025-09-24 20:27:02.323 2035 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 167
node2 2m 2.180s 2025-09-24 20:27:02.325 2036 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 167
Timestamp: 2025-09-24T20:27:00.309380Z
Next consensus number: 4329
Legacy running event hash: 51c9927c212503a76ce269a60a01593c92809d02abbc8c5767d1df3442a32f11250b2b0754628329345a1a47183ce0d6
Legacy running event mnemonic: decrease-rack-caution-insane
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1367769167
Root hash: 2d8e78b6d41ccdd85c942318e09e47bd0bb2ea7609ae842f3fcaad4f01c91e2a36874a1159dca539cb302671cfc522a9
(root) ConsistencyTestingToolState / range-bitter-lady-eagle
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 vast-front-future-prefer
    1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline
    2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
    3 StringLeaf -4986745221054146073 /3 strong-prosper-riot-ozone
    4 StringLeaf 166 /4 wolf-domain-child-slam
node2 2m 2.188s 2025-09-24 20:27:02.333 2037 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/24/2025-09-24T20+25+16.619264354Z_seq0_minr1_maxr501_orgn0.pces
node2 2m 2.188s 2025-09-24 20:27:02.333 2038 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 139 File: data/saved/preconsensus-events/2/2025/09/24/2025-09-24T20+25+16.619264354Z_seq0_minr1_maxr501_orgn0.pces
node2 2m 2.188s 2025-09-24 20:27:02.333 2039 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 2m 2.191s 2025-09-24 20:27:02.336 2040 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 2m 2.192s 2025-09-24 20:27:02.337 2041 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 167 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/167 {"round":167,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/167/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 2m 2.199s 2025-09-24 20:27:02.344 1955 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 167 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/167
node4 2m 2.200s 2025-09-24 20:27:02.345 1956 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 167
node1 2m 2.272s 2025-09-24 20:27:02.417 1993 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 167 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/167
node1 2m 2.273s 2025-09-24 20:27:02.418 1994 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 167
node4 2m 2.291s 2025-09-24 20:27:02.436 1987 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 167
node4 2m 2.294s 2025-09-24 20:27:02.439 1988 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 167
Timestamp: 2025-09-24T20:27:00.309380Z
Next consensus number: 4329
Legacy running event hash: 51c9927c212503a76ce269a60a01593c92809d02abbc8c5767d1df3442a32f11250b2b0754628329345a1a47183ce0d6
Legacy running event mnemonic: decrease-rack-caution-insane
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1367769167
Root hash: 2d8e78b6d41ccdd85c942318e09e47bd0bb2ea7609ae842f3fcaad4f01c91e2a36874a1159dca539cb302671cfc522a9
(root) ConsistencyTestingToolState / range-bitter-lady-eagle
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 vast-front-future-prefer
    1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline
    2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
    3 StringLeaf -4986745221054146073 /3 strong-prosper-riot-ozone
    4 StringLeaf 166 /4 wolf-domain-child-slam
node4 2m 2.302s 2025-09-24 20:27:02.447 1989 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/09/24/2025-09-24T20+25+16.448491026Z_seq0_minr1_maxr501_orgn0.pces
node4 2m 2.303s 2025-09-24 20:27:02.448 1990 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 139 File: data/saved/preconsensus-events/4/2025/09/24/2025-09-24T20+25+16.448491026Z_seq0_minr1_maxr501_orgn0.pces
node4 2m 2.303s 2025-09-24 20:27:02.448 1991 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 2m 2.306s 2025-09-24 20:27:02.451 1992 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 2m 2.307s 2025-09-24 20:27:02.452 1993 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 167 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/167 {"round":167,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/167/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 2m 2.347s 2025-09-24 20:27:02.492 1997 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 167 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/167
node3 2m 2.348s 2025-09-24 20:27:02.493 1998 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 167
node1 2m 2.365s 2025-09-24 20:27:02.510 2037 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 167
node1 2m 2.367s 2025-09-24 20:27:02.512 2038 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 167
Timestamp: 2025-09-24T20:27:00.309380Z
Next consensus number: 4329
Legacy running event hash: 51c9927c212503a76ce269a60a01593c92809d02abbc8c5767d1df3442a32f11250b2b0754628329345a1a47183ce0d6
Legacy running event mnemonic: decrease-rack-caution-insane
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1367769167
Root hash: 2d8e78b6d41ccdd85c942318e09e47bd0bb2ea7609ae842f3fcaad4f01c91e2a36874a1159dca539cb302671cfc522a9
(root) ConsistencyTestingToolState / range-bitter-lady-eagle
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 vast-front-future-prefer
    1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline
    2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
    3 StringLeaf -4986745221054146073 /3 strong-prosper-riot-ozone
    4 StringLeaf 166 /4 wolf-domain-child-slam
node1 2m 2.374s 2025-09-24 20:27:02.519 2039 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/24/2025-09-24T20+25+16.237543055Z_seq0_minr1_maxr501_orgn0.pces
node1 2m 2.374s 2025-09-24 20:27:02.519 2040 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 139 File: data/saved/preconsensus-events/1/2025/09/24/2025-09-24T20+25+16.237543055Z_seq0_minr1_maxr501_orgn0.pces
node1 2m 2.375s 2025-09-24 20:27:02.520 2041 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 2m 2.378s 2025-09-24 20:27:02.523 2042 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 2m 2.378s 2025-09-24 20:27:02.523 2043 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 167 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/167 {"round":167,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/167/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 2m 2.441s 2025-09-24 20:27:02.586 2037 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 167
node3 2m 2.443s 2025-09-24 20:27:02.588 2038 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 167
Timestamp: 2025-09-24T20:27:00.309380Z
Next consensus number: 4329
Legacy running event hash: 51c9927c212503a76ce269a60a01593c92809d02abbc8c5767d1df3442a32f11250b2b0754628329345a1a47183ce0d6
Legacy running event mnemonic: decrease-rack-caution-insane
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1367769167
Root hash: 2d8e78b6d41ccdd85c942318e09e47bd0bb2ea7609ae842f3fcaad4f01c91e2a36874a1159dca539cb302671cfc522a9
(root) ConsistencyTestingToolState / range-bitter-lady-eagle
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 vast-front-future-prefer
    1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline
    2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
    3 StringLeaf -4986745221054146073 /3 strong-prosper-riot-ozone
    4 StringLeaf 166 /4 wolf-domain-child-slam
node3 2m 2.451s 2025-09-24 20:27:02.596 2039 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/24/2025-09-24T20+25+16.374322506Z_seq0_minr1_maxr501_orgn0.pces
node3 2m 2.451s 2025-09-24 20:27:02.596 2040 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 139 File: data/saved/preconsensus-events/3/2025/09/24/2025-09-24T20+25+16.374322506Z_seq0_minr1_maxr501_orgn0.pces
node3 2m 2.451s 2025-09-24 20:27:02.596 2041 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 2m 2.455s 2025-09-24 20:27:02.600 2042 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 2m 2.455s 2025-09-24 20:27:02.600 2043 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 167 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/167 {"round":167,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/167/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 3m 1.104s 2025-09-24 20:28:01.249 3077 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 260 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 3m 1.167s 2025-09-24 20:28:01.312 3063 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 260 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 3m 1.191s 2025-09-24 20:28:01.336 3071 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 260 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 3m 1.370s 2025-09-24 20:28:01.515 3087 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 260 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 3m 1.399s 2025-09-24 20:28:01.544 3029 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 260 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 3m 1.507s 2025-09-24 20:28:01.652 3090 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 260 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/260
node2 3m 1.508s 2025-09-24 20:28:01.653 3091 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 260
node0 3m 1.513s 2025-09-24 20:28:01.658 3080 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 260 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/260
node0 3m 1.513s 2025-09-24 20:28:01.658 3081 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 260
node1 3m 1.583s 2025-09-24 20:28:01.728 3066 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 260 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/260
node1 3m 1.583s 2025-09-24 20:28:01.728 3067 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 260
node0 3m 1.601s 2025-09-24 20:28:01.746 3120 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 260
node0 3m 1.603s 2025-09-24 20:28:01.748 3121 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 260
Timestamp: 2025-09-24T20:28:00.045236907Z
Next consensus number: 6832
Legacy running event hash: c81ffe55f19259314fd6ee1f96d133d05572d4acdb3b66d1e8e79af6abfba5f68efe2d3b084cf34bec5250edddf27ee3
Legacy running event mnemonic: wire-cube-soccer-change
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1845518224
Root hash: 352ea036ff0daac342e6efe71167c2593c5fc788cf491991714bbc630940f9ca628e919135ba038237256405d7ec1072
(root) ConsistencyTestingToolState / place-welcome-bunker-police
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 wool-mad-laptop-tip
    1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline
    2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
    3 StringLeaf 3744925396723063960 /3 pretty-motor-can-dash
    4 StringLeaf 259 /4 chronic-core-repair-exotic
node0 3m 1.609s 2025-09-24 20:28:01.754 3122 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/24/2025-09-24T20+25+16.311647372Z_seq0_minr1_maxr501_orgn0.pces
node0 3m 1.609s 2025-09-24 20:28:01.754 3123 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 233 File: data/saved/preconsensus-events/0/2025/09/24/2025-09-24T20+25+16.311647372Z_seq0_minr1_maxr501_orgn0.pces
node0 3m 1.609s 2025-09-24 20:28:01.754 3124 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 3m 1.610s 2025-09-24 20:28:01.755 3122 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 260
node2 3m 1.612s 2025-09-24 20:28:01.757 3123 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 260
Timestamp: 2025-09-24T20:28:00.045236907Z
Next consensus number: 6832
Legacy running event hash: c81ffe55f19259314fd6ee1f96d133d05572d4acdb3b66d1e8e79af6abfba5f68efe2d3b084cf34bec5250edddf27ee3
Legacy running event mnemonic: wire-cube-soccer-change
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1845518224
Root hash: 352ea036ff0daac342e6efe71167c2593c5fc788cf491991714bbc630940f9ca628e919135ba038237256405d7ec1072
(root) ConsistencyTestingToolState / place-welcome-bunker-police
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 wool-mad-laptop-tip
    1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline
    2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
    3 StringLeaf 3744925396723063960 /3 pretty-motor-can-dash
    4 StringLeaf 259 /4 chronic-core-repair-exotic
node3 3m 1.613s 2025-09-24 20:28:01.758 3074 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 260 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/260
node0 3m 1.614s 2025-09-24 20:28:01.759 3125 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 3m 1.614s 2025-09-24 20:28:01.759 3075 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 260
node0 3m 1.615s 2025-09-24 20:28:01.760 3126 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 260 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/260 {"round":260,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/260/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 3m 1.620s 2025-09-24 20:28:01.765 3124 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/24/2025-09-24T20+25+16.619264354Z_seq0_minr1_maxr501_orgn0.pces
node2 3m 1.621s 2025-09-24 20:28:01.766 3125 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 233 File: data/saved/preconsensus-events/2/2025/09/24/2025-09-24T20+25+16.619264354Z_seq0_minr1_maxr501_orgn0.pces
node2 3m 1.621s 2025-09-24 20:28:01.766 3126 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 3m 1.626s 2025-09-24 20:28:01.771 3135 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 3m 1.627s 2025-09-24 20:28:01.772 3136 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 260 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/260 {"round":260,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/260/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 3m 1.654s 2025-09-24 20:28:01.799 3032 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 260 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/260
node4 3m 1.655s 2025-09-24 20:28:01.800 3033 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 260
node1 3m 1.668s 2025-09-24 20:28:01.813 3098 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 260
node1 3m 1.670s 2025-09-24 20:28:01.815 3099 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 260
Timestamp: 2025-09-24T20:28:00.045236907Z
Next consensus number: 6832
Legacy running event hash: c81ffe55f19259314fd6ee1f96d133d05572d4acdb3b66d1e8e79af6abfba5f68efe2d3b084cf34bec5250edddf27ee3
Legacy running event mnemonic: wire-cube-soccer-change
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1845518224
Root hash: 352ea036ff0daac342e6efe71167c2593c5fc788cf491991714bbc630940f9ca628e919135ba038237256405d7ec1072
(root) ConsistencyTestingToolState / place-welcome-bunker-police
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 wool-mad-laptop-tip
    1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline
    2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
    3 StringLeaf 3744925396723063960 /3 pretty-motor-can-dash
    4 StringLeaf 259 /4 chronic-core-repair-exotic
node1 3m 1.676s 2025-09-24 20:28:01.821 3100 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/24/2025-09-24T20+25+16.237543055Z_seq0_minr1_maxr501_orgn0.pces
node1 3m 1.677s 2025-09-24 20:28:01.822 3101 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 233 File: data/saved/preconsensus-events/1/2025/09/24/2025-09-24T20+25+16.237543055Z_seq0_minr1_maxr501_orgn0.pces
node1 3m 1.677s 2025-09-24 20:28:01.822 3102 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 3m 1.682s 2025-09-24 20:28:01.827 3103 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 3m 1.682s 2025-09-24 20:28:01.827 3104 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 260 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/260 {"round":260,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/260/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 3m 1.703s 2025-09-24 20:28:01.848 3118 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 260
node3 3m 1.705s 2025-09-24 20:28:01.850 3119 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 260
Timestamp: 2025-09-24T20:28:00.045236907Z
Next consensus number: 6832
Legacy running event hash: c81ffe55f19259314fd6ee1f96d133d05572d4acdb3b66d1e8e79af6abfba5f68efe2d3b084cf34bec5250edddf27ee3
Legacy running event mnemonic: wire-cube-soccer-change
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1845518224
Root hash: 352ea036ff0daac342e6efe71167c2593c5fc788cf491991714bbc630940f9ca628e919135ba038237256405d7ec1072
(root) ConsistencyTestingToolState / place-welcome-bunker-police
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 wool-mad-laptop-tip
    1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline
    2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
    3 StringLeaf 3744925396723063960 /3 pretty-motor-can-dash
    4 StringLeaf 259 /4 chronic-core-repair-exotic
node3 3m 1.713s 2025-09-24 20:28:01.858 3120 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/24/2025-09-24T20+25+16.374322506Z_seq0_minr1_maxr501_orgn0.pces
node3 3m 1.713s 2025-09-24 20:28:01.858 3121 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 233 File: data/saved/preconsensus-events/3/2025/09/24/2025-09-24T20+25+16.374322506Z_seq0_minr1_maxr501_orgn0.pces
node3 3m 1.713s 2025-09-24 20:28:01.858 3122 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 3m 1.718s 2025-09-24 20:28:01.863 3123 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 3m 1.719s 2025-09-24 20:28:01.864 3124 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 260 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/260 {"round":260,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/260/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 3m 1.744s 2025-09-24 20:28:01.889 3064 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 260
node4 3m 1.746s 2025-09-24 20:28:01.891 3065 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 260
Timestamp: 2025-09-24T20:28:00.045236907Z
Next consensus number: 6832
Legacy running event hash: c81ffe55f19259314fd6ee1f96d133d05572d4acdb3b66d1e8e79af6abfba5f68efe2d3b084cf34bec5250edddf27ee3
Legacy running event mnemonic: wire-cube-soccer-change
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1845518224
Root hash: 352ea036ff0daac342e6efe71167c2593c5fc788cf491991714bbc630940f9ca628e919135ba038237256405d7ec1072
(root) ConsistencyTestingToolState / place-welcome-bunker-police
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 wool-mad-laptop-tip
    1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline
    2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
    3 StringLeaf 3744925396723063960 /3 pretty-motor-can-dash
    4 StringLeaf 259 /4 chronic-core-repair-exotic
node4 3m 1.753s 2025-09-24 20:28:01.898 3066 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/09/24/2025-09-24T20+25+16.448491026Z_seq0_minr1_maxr501_orgn0.pces
node4 3m 1.753s 2025-09-24 20:28:01.898 3067 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 233 File: data/saved/preconsensus-events/4/2025/09/24/2025-09-24T20+25+16.448491026Z_seq0_minr1_maxr501_orgn0.pces
node4 3m 1.753s 2025-09-24 20:28:01.898 3068 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 3m 1.759s 2025-09-24 20:28:01.904 3069 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 3m 1.759s 2025-09-24 20:28:01.904 3070 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 260 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/260 {"round":260,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/260/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 3m 11.736s 2025-09-24 20:28:11.881 3300 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 0 to 4>> NetworkUtils: Connection broken: 0 -> 4
java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.FilterInputStream.read(FilterInputStream.java:71)
    at org.hiero.base.io.streams.AugmentedDataInputStream.read(AugmentedDataInputStream.java:57)
    at com.swirlds.platform.network.communication.states.SentInitiate.transition(SentInitiate.java:73)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
node3 3m 11.738s 2025-09-24 20:28:11.883 3289 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 3 to 4>> NetworkUtils: Connection broken: 3 -> 4
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-24T20:28:11.881647595Z
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-24T20:28:11.881647595Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:180)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
    ... 7 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
    at java.base/java.io.DataInputStream.readBoolean(DataInputStream.java:255)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readBoolean(AugmentedDataInputStream.java:137)
    at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readMyTipsTheyHave$7(SyncUtils.java:163)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
    ... 11 more
node1 3m 11.740s 2025-09-24 20:28:11.885 3279 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 1 to 4>> NetworkUtils: Connection broken: 1 -> 4
java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.FilterInputStream.read(FilterInputStream.java:71)
    at org.hiero.base.io.streams.AugmentedDataInputStream.read(AugmentedDataInputStream.java:57)
    at com.swirlds.platform.network.communication.states.SentKeepalive.transition(SentKeepalive.java:44)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
node2 3m 11.740s 2025-09-24 20:28:11.885 3296 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 2 to 4>> NetworkUtils: Connection broken: 2 -> 4
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-24T20:28:11.882409793Z
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-24T20:28:11.882409793Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.sendAndReceiveEvents(ShadowgraphSynchronizer.java:241)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:201)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
    ... 7 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
    at java.base/java.io.DataInputStream.readByte(DataInputStream.java:275)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readByte(AugmentedDataInputStream.java:144)
    at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readEventsINeed$9(SyncUtils.java:278)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
    ... 12 more
node1 4m 1.727s 2025-09-24 20:29:01.872 4116 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 348 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 4m 1.735s 2025-09-24 20:29:01.880 4104 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 348 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 4m 1.854s 2025-09-24 20:29:01.999 4110 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 348 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 4m 1.869s 2025-09-24 20:29:02.014 4124 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 348 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 4m 2.135s 2025-09-24 20:29:02.280 4129 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 348 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/348
node1 4m 2.135s 2025-09-24 20:29:02.280 4130 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 348
node0 4m 2.145s 2025-09-24 20:29:02.290 4117 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 348 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/348
node0 4m 2.146s 2025-09-24 20:29:02.291 4118 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 348
node3 4m 2.215s 2025-09-24 20:29:02.360 4123 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 348 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/348
node3 4m 2.216s 2025-09-24 20:29:02.361 4124 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 348
node2 4m 2.217s 2025-09-24 20:29:02.362 4127 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 348 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/348
node1 4m 2.218s 2025-09-24 20:29:02.363 4165 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 348
node2 4m 2.218s 2025-09-24 20:29:02.363 4128 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 348
node1 4m 2.220s 2025-09-24 20:29:02.365 4166 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 348
Timestamp: 2025-09-24T20:29:00.603246Z
Next consensus number: 8557
Legacy running event hash: 592fd2cdefa8ec2a2709ded08b016c245a36664ec0d8e9ab8cefbfb318d27780b52fdcca574d7e4b16845bd8c9040575
Legacy running event mnemonic: chalk-please-arch-kiss
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1684012177
Root hash: 31d150b8c04ce11dbf5ab4627561f699f00dbee3a21b95634bba1ae3551c2ce5cda2e803459dd9ba1559ebbd8accc0ca
(root) ConsistencyTestingToolState / demand-blade-walk-symbol
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 memory-clap-verify-doll
    1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline
    2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
    3 StringLeaf -946711140097080006 /3 north-song-goddess-game
    4 StringLeaf 347 /4 mad-dog-valid-floor
node1 4m 2.226s 2025-09-24 20:29:02.371 4167 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/24/2025-09-24T20+25+16.237543055Z_seq0_minr1_maxr501_orgn0.pces
node1 4m 2.227s 2025-09-24 20:29:02.372 4168 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 321 File: data/saved/preconsensus-events/1/2025/09/24/2025-09-24T20+25+16.237543055Z_seq0_minr1_maxr501_orgn0.pces
node1 4m 2.227s 2025-09-24 20:29:02.372 4169 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 4m 2.228s 2025-09-24 20:29:02.373 4157 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 348
node0 4m 2.230s 2025-09-24 20:29:02.375 4158 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 348
Timestamp: 2025-09-24T20:29:00.603246Z
Next consensus number: 8557
Legacy running event hash: 592fd2cdefa8ec2a2709ded08b016c245a36664ec0d8e9ab8cefbfb318d27780b52fdcca574d7e4b16845bd8c9040575
Legacy running event mnemonic: chalk-please-arch-kiss
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1684012177
Root hash: 31d150b8c04ce11dbf5ab4627561f699f00dbee3a21b95634bba1ae3551c2ce5cda2e803459dd9ba1559ebbd8accc0ca
(root) ConsistencyTestingToolState / demand-blade-walk-symbol
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 memory-clap-verify-doll
    1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline
    2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
    3 StringLeaf -946711140097080006 /3 north-song-goddess-game
    4 StringLeaf 347 /4 mad-dog-valid-floor
node1 4m 2.233s 2025-09-24 20:29:02.378 4170 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 4m 2.233s 2025-09-24 20:29:02.378 4171 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 348 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/348 {"round":348,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/348/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 4m 2.237s 2025-09-24 20:29:02.382 4159 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/24/2025-09-24T20+25+16.311647372Z_seq0_minr1_maxr501_orgn0.pces
node0 4m 2.237s 2025-09-24 20:29:02.382 4160 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 321 File: data/saved/preconsensus-events/0/2025/09/24/2025-09-24T20+25+16.311647372Z_seq0_minr1_maxr501_orgn0.pces
node0 4m 2.237s 2025-09-24 20:29:02.382 4161 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 4m 2.243s 2025-09-24 20:29:02.388 4162 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 4m 2.244s 2025-09-24 20:29:02.389 4163 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 348 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/348 {"round":348,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/348/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 4m 2.297s 2025-09-24 20:29:02.442 4155 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 348
node3 4m 2.298s 2025-09-24 20:29:02.443 4164 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 348
Timestamp: 2025-09-24T20:29:00.603246Z
Next consensus number: 8557
Legacy running event hash: 592fd2cdefa8ec2a2709ded08b016c245a36664ec0d8e9ab8cefbfb318d27780b52fdcca574d7e4b16845bd8c9040575
Legacy running event mnemonic: chalk-please-arch-kiss
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1684012177
Root hash: 31d150b8c04ce11dbf5ab4627561f699f00dbee3a21b95634bba1ae3551c2ce5cda2e803459dd9ba1559ebbd8accc0ca
(root) ConsistencyTestingToolState / demand-blade-walk-symbol
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 memory-clap-verify-doll
  1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline
  2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
  3 StringLeaf -946711140097080006 /3 north-song-goddess-game
  4 StringLeaf 347 /4 mad-dog-valid-floor
node3 4m 2.304s 2025-09-24 20:29:02.449 4165 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/24/2025-09-24T20+25+16.374322506Z_seq0_minr1_maxr501_orgn0.pces
node3 4m 2.304s 2025-09-24 20:29:02.449 4166 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 321 File: data/saved/preconsensus-events/3/2025/09/24/2025-09-24T20+25+16.374322506Z_seq0_minr1_maxr501_orgn0.pces
node3 4m 2.304s 2025-09-24 20:29:02.449 4167 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 4m 2.311s 2025-09-24 20:29:02.456 4168 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 4m 2.311s 2025-09-24 20:29:02.456 4169 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 348 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/348 {"round":348,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/348/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 4m 2.314s 2025-09-24 20:29:02.459 4163 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 348
node2 4m 2.316s 2025-09-24 20:29:02.461 4164 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 348
Timestamp: 2025-09-24T20:29:00.603246Z
Next consensus number: 8557
Legacy running event hash: 592fd2cdefa8ec2a2709ded08b016c245a36664ec0d8e9ab8cefbfb318d27780b52fdcca574d7e4b16845bd8c9040575
Legacy running event mnemonic: chalk-please-arch-kiss
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1684012177
Root hash: 31d150b8c04ce11dbf5ab4627561f699f00dbee3a21b95634bba1ae3551c2ce5cda2e803459dd9ba1559ebbd8accc0ca
(root) ConsistencyTestingToolState / demand-blade-walk-symbol
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 memory-clap-verify-doll
  1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline
  2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
  3 StringLeaf -946711140097080006 /3 north-song-goddess-game
  4 StringLeaf 347 /4 mad-dog-valid-floor
node2 4m 2.324s 2025-09-24 20:29:02.469 4165 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/24/2025-09-24T20+25+16.619264354Z_seq0_minr1_maxr501_orgn0.pces
node2 4m 2.324s 2025-09-24 20:29:02.469 4166 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 321 File: data/saved/preconsensus-events/2/2025/09/24/2025-09-24T20+25+16.619264354Z_seq0_minr1_maxr501_orgn0.pces
node2 4m 2.324s 2025-09-24 20:29:02.469 4167 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 4m 2.331s 2025-09-24 20:29:02.476 4168 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 4m 2.331s 2025-09-24 20:29:02.476 4169 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 348 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/348 {"round":348,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/348/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 5m 1.805s 2025-09-24 20:30:01.950 5176 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 436 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 5m 1.926s 2025-09-24 20:30:02.071 5160 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 436 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 5m 2.027s 2025-09-24 20:30:02.172 5160 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 436 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 5m 2.084s 2025-09-24 20:30:02.229 5182 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 436 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 5m 2.234s 2025-09-24 20:30:02.379 5185 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 436 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/436
node3 5m 2.234s 2025-09-24 20:30:02.379 5186 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 436
node2 5m 2.304s 2025-09-24 20:30:02.449 5163 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 436 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/436
node2 5m 2.305s 2025-09-24 20:30:02.450 5164 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 436
node3 5m 2.322s 2025-09-24 20:30:02.467 5221 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 436
node3 5m 2.325s 2025-09-24 20:30:02.470 5222 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 436
Timestamp: 2025-09-24T20:30:00.677717Z
Next consensus number: 10089
Legacy running event hash: 920ac3f509c17ab6df59ebb15d495ef0aee74c4c93111dc169db4926cb61b9bda94492d3031ee9365009818d15073bc6
Legacy running event mnemonic: salute-lift-deal-fine
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1777943994
Root hash: adada557804909b40c93179534a92d59ea66fdc9996d174e3a0102d131f7a4e5dd5ecd1ce5f6a13423f67cdd083ab23e
(root) ConsistencyTestingToolState / engine-festival-want-custom
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 wear-height-payment-oyster
  1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline
  2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
  3 StringLeaf -1583837310948035076 /3 exact-doctor-apology-alter
  4 StringLeaf 435 /4 caught-knife-shiver-candy
node3 5m 2.332s 2025-09-24 20:30:02.477 5223 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/24/2025-09-24T20+25+16.374322506Z_seq0_minr1_maxr501_orgn0.pces
node3 5m 2.332s 2025-09-24 20:30:02.477 5224 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 409 File: data/saved/preconsensus-events/3/2025/09/24/2025-09-24T20+25+16.374322506Z_seq0_minr1_maxr501_orgn0.pces
node3 5m 2.332s 2025-09-24 20:30:02.477 5225 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 5m 2.339s 2025-09-24 20:30:02.484 5226 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 5m 2.340s 2025-09-24 20:30:02.485 5227 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 436 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/436 {"round":436,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/436/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 5m 2.342s 2025-09-24 20:30:02.487 5228 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/2
node2 5m 2.392s 2025-09-24 20:30:02.537 5195 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 436
node2 5m 2.395s 2025-09-24 20:30:02.540 5196 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 436
Timestamp: 2025-09-24T20:30:00.677717Z
Next consensus number: 10089
Legacy running event hash: 920ac3f509c17ab6df59ebb15d495ef0aee74c4c93111dc169db4926cb61b9bda94492d3031ee9365009818d15073bc6
Legacy running event mnemonic: salute-lift-deal-fine
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1777943994
Root hash: adada557804909b40c93179534a92d59ea66fdc9996d174e3a0102d131f7a4e5dd5ecd1ce5f6a13423f67cdd083ab23e
(root) ConsistencyTestingToolState / engine-festival-want-custom
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 wear-height-payment-oyster
  1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline
  2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
  3 StringLeaf -1583837310948035076 /3 exact-doctor-apology-alter
  4 StringLeaf 435 /4 caught-knife-shiver-candy
node2 5m 2.402s 2025-09-24 20:30:02.547 5197 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/24/2025-09-24T20+25+16.619264354Z_seq0_minr1_maxr501_orgn0.pces
node2 5m 2.403s 2025-09-24 20:30:02.548 5198 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 409 File: data/saved/preconsensus-events/2/2025/09/24/2025-09-24T20+25+16.619264354Z_seq0_minr1_maxr501_orgn0.pces
node2 5m 2.403s 2025-09-24 20:30:02.548 5199 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 5m 2.410s 2025-09-24 20:30:02.555 5200 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 5m 2.411s 2025-09-24 20:30:02.556 5201 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 436 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/436 {"round":436,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/436/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 5m 2.413s 2025-09-24 20:30:02.558 5202 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/2
node1 5m 2.435s 2025-09-24 20:30:02.580 5173 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 436 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/436
node1 5m 2.436s 2025-09-24 20:30:02.581 5174 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 436
node1 5m 2.521s 2025-09-24 20:30:02.666 5205 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 436
node1 5m 2.523s 2025-09-24 20:30:02.668 5206 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 436
Timestamp: 2025-09-24T20:30:00.677717Z
Next consensus number: 10089
Legacy running event hash: 920ac3f509c17ab6df59ebb15d495ef0aee74c4c93111dc169db4926cb61b9bda94492d3031ee9365009818d15073bc6
Legacy running event mnemonic: salute-lift-deal-fine
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1777943994
Root hash: adada557804909b40c93179534a92d59ea66fdc9996d174e3a0102d131f7a4e5dd5ecd1ce5f6a13423f67cdd083ab23e
(root) ConsistencyTestingToolState / engine-festival-want-custom
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 wear-height-payment-oyster
  1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline
  2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
  3 StringLeaf -1583837310948035076 /3 exact-doctor-apology-alter
  4 StringLeaf 435 /4 caught-knife-shiver-candy
node1 5m 2.530s 2025-09-24 20:30:02.675 5207 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/24/2025-09-24T20+25+16.237543055Z_seq0_minr1_maxr501_orgn0.pces
node1 5m 2.530s 2025-09-24 20:30:02.675 5208 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 409 File: data/saved/preconsensus-events/1/2025/09/24/2025-09-24T20+25+16.237543055Z_seq0_minr1_maxr501_orgn0.pces
node1 5m 2.530s 2025-09-24 20:30:02.675 5209 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 5m 2.537s 2025-09-24 20:30:02.682 5210 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 5m 2.538s 2025-09-24 20:30:02.683 5211 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 436 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/436 {"round":436,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/436/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 5m 2.540s 2025-09-24 20:30:02.685 5212 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/2
node0 5m 2.633s 2025-09-24 20:30:02.778 5189 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 436 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/436
node0 5m 2.633s 2025-09-24 20:30:02.778 5190 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 436
node0 5m 2.716s 2025-09-24 20:30:02.861 5229 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 436
node0 5m 2.718s 2025-09-24 20:30:02.863 5230 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 436
Timestamp: 2025-09-24T20:30:00.677717Z
Next consensus number: 10089
Legacy running event hash: 920ac3f509c17ab6df59ebb15d495ef0aee74c4c93111dc169db4926cb61b9bda94492d3031ee9365009818d15073bc6
Legacy running event mnemonic: salute-lift-deal-fine
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1777943994
Root hash: adada557804909b40c93179534a92d59ea66fdc9996d174e3a0102d131f7a4e5dd5ecd1ce5f6a13423f67cdd083ab23e
(root) ConsistencyTestingToolState / engine-festival-want-custom
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 wear-height-payment-oyster
  1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline
  2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
  3 StringLeaf -1583837310948035076 /3 exact-doctor-apology-alter
  4 StringLeaf 435 /4 caught-knife-shiver-candy
node0 5m 2.724s 2025-09-24 20:30:02.869 5231 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/24/2025-09-24T20+25+16.311647372Z_seq0_minr1_maxr501_orgn0.pces
node0 5m 2.724s 2025-09-24 20:30:02.869 5232 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 409 File: data/saved/preconsensus-events/0/2025/09/24/2025-09-24T20+25+16.311647372Z_seq0_minr1_maxr501_orgn0.pces
node0 5m 2.724s 2025-09-24 20:30:02.869 5233 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 5m 2.732s 2025-09-24 20:30:02.877 5234 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 5m 2.733s 2025-09-24 20:30:02.878 5235 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 436 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/436 {"round":436,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/436/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 5m 2.734s 2025-09-24 20:30:02.879 5236 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/2
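Each "Finished writing state" entry above ends with a JSON trailer tagged com.swirlds.logging.legacy.payload.StateSavedToDiskPayload, carrying the round, the freeze flag, the reason, and the destination directory. Below is a minimal, self-contained sketch of how such a trailer could be pulled out of these lines with a regular expression when post-processing this log; the class and method names are illustrative only, not part of the platform, and the sample input is copied from the node0 round-436 entry above.

    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    // Illustrative parser for the StateSavedToDiskPayload JSON trailer seen in the
    // "Finished writing state" log entries above. The field names ("round",
    // "freezeState", "reason", "directory") are copied from the log; the class
    // itself is a hypothetical post-processing helper.
    public class StateSavedPayloadScan {

        // Matches the JSON object that precedes the trailing payload class name.
        private static final Pattern PAYLOAD = Pattern.compile(
                "\\{\"round\":(\\d+),\"freezeState\":(true|false),\"reason\":\"([A-Z_]+)\",\"directory\":\"([^\"]+)\"\\}");

        public static void main(String[] args) {
            // Trailer copied verbatim from the node0 entry for round 436 above.
            String trailer = "{\"round\":436,\"freezeState\":false,\"reason\":\"PERIODIC_SNAPSHOT\","
                    + "\"directory\":\"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/"
                    + "com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/436/\"}";

            Matcher m = PAYLOAD.matcher(trailer);
            if (m.find()) {
                long round = Long.parseLong(m.group(1));            // 436
                boolean freeze = Boolean.parseBoolean(m.group(2));  // false
                String reason = m.group(3);                         // PERIODIC_SNAPSHOT
                String directory = m.group(4);                      // file:///opt/hgcapp/...
                System.out.printf("round=%d freeze=%b reason=%s%ndir=%s%n", round, freeze, reason, directory);
            }
        }
    }

Scanning all such trailers per node is one way to confirm that every node persisted the same rounds (2, 74, 167, 260, 348, 436, ...) for the same reasons.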
node4 5m 50.320s 2025-09-24 20:30:50.465 1 INFO STARTUP <main> StaticPlatformBuilder:
////////////////////// // Node is Starting // //////////////////////
node4 5m 50.407s 2025-09-24 20:30:50.552 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node4 5m 50.422s 2025-09-24 20:30:50.567 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 5m 50.533s 2025-09-24 20:30:50.678 4 INFO STARTUP <main> Browser: The following nodes [4] are set to run locally
node4 5m 50.539s 2025-09-24 20:30:50.684 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node4 5m 50.551s 2025-09-24 20:30:50.696 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 5m 50.967s 2025-09-24 20:30:51.112 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node4 5m 50.967s 2025-09-24 20:30:51.112 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node4 5m 51.971s 2025-09-24 20:30:52.116 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1003ms
node4 5m 51.979s 2025-09-24 20:30:52.124 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node4 5m 51.982s 2025-09-24 20:30:52.127 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 5m 52.018s 2025-09-24 20:30:52.163 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node4 5m 52.074s 2025-09-24 20:30:52.219 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node4 5m 52.075s 2025-09-24 20:30:52.220 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node4 5m 54.086s 2025-09-24 20:30:54.231 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node4 5m 54.168s 2025-09-24 20:30:54.313 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 54.175s 2025-09-24 20:30:54.320 21 INFO STARTUP <main> StartupStateUtils: The following saved states were found on disk:
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/260/SignedState.swh
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/167/SignedState.swh
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/74/SignedState.swh
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/2/SignedState.swh
node4 5m 54.175s 2025-09-24 20:30:54.320 22 INFO STARTUP <main> StartupStateUtils: Loading latest state from disk.
node4 5m 54.175s 2025-09-24 20:30:54.320 23 INFO STARTUP <main> StartupStateUtils: Loading signed state from disk: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/260/SignedState.swh
node4 5m 54.179s 2025-09-24 20:30:54.324 24 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 5m 54.184s 2025-09-24 20:30:54.329 25 INFO STATE_TO_DISK <main> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp
node4 5m 54.315s 2025-09-24 20:30:54.460 36 INFO STARTUP <main> StartupStateUtils: Loaded state's hash is the same as when it was saved.
node4 5m 54.318s 2025-09-24 20:30:54.463 37 INFO STARTUP <main> StartupStateUtils: Platform has loaded a saved state {"round":260,"consensusTimestamp":"2025-09-24T20:28:00.045236907Z"} [com.swirlds.logging.legacy.payload.SavedStateLoadedPayload]
node4 5m 54.320s 2025-09-24 20:30:54.465 40 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 54.322s 2025-09-24 20:30:54.467 43 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node4 5m 54.324s 2025-09-24 20:30:54.469 44 INFO STARTUP <main> AddressBookInitializer: Using the loaded state's address book and weight values.
node4 5m 54.331s 2025-09-24 20:30:54.476 45 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 54.332s 2025-09-24 20:30:54.477 46 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 55.364s 2025-09-24 20:30:55.509 47 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26313269]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=234710, randomLong=-4166617050475067736, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=43040, randomLong=4427165252759531243, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1061070, data=35, exception=null]
OS Health Check Report - Complete (took 1019 ms)
node4 5m 55.391s 2025-09-24 20:30:55.536 48 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node4 5m 55.481s 2025-09-24 20:30:55.626 49 INFO STARTUP <main> PcesUtilities: Span compaction completed for data/saved/preconsensus-events/4/2025/09/24/2025-09-24T20+25+16.448491026Z_seq0_minr1_maxr501_orgn0.pces, new upper bound is 276
node4 5m 55.484s 2025-09-24 20:30:55.629 50 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node4 5m 55.489s 2025-09-24 20:30:55.634 51 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node4 5m 55.561s 2025-09-24 20:30:55.706 52 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "Ijy81A==", "port": 30124 }, { "ipAddressV4": "CoAAIg==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "IhwXsQ==", "port": 30125 }, { "ipAddressV4": "CoAAFA==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "Ih13kg==", "port": 30126 }, { "ipAddressV4": "CoAAKQ==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "Iq1CrQ==", "port": 30127 }, { "ipAddressV4": "CoAAGA==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "I7gDDA==", "port": 30128 }, { "ipAddressV4": "CoAAHA==", "port": 30128 }] }] }
node4 5m 55.581s 2025-09-24 20:30:55.726 53 INFO STARTUP <main> ConsistencyTestingToolState: State initialized with state long 3744925396723063960.
node4 5m 55.582s 2025-09-24 20:30:55.727 54 INFO STARTUP <main> ConsistencyTestingToolState: State initialized with 259 rounds handled.
node4 5m 55.582s 2025-09-24 20:30:55.727 55 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 5m 55.583s 2025-09-24 20:30:55.728 56 INFO STARTUP <main> TransactionHandlingHistory: Log file found. Parsing previous history
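In the roster printed above (entry 52), the gossipEndpoint addresses are serialized as base64-encoded 4-byte values rather than dotted quads, which is the usual protobuf-JSON rendering of a bytes field. The short sketch below decodes the two values from the first roster entry; the class and helper names are invented for illustration, while the input strings and ports are copied from the roster itself. "Ijy81A==" decodes to 34.60.188.212 and "CoAAIg==" to 10.128.0.34.

    import java.util.Base64;

    // Decodes the base64 "ipAddressV4" values from the roster JSON above into
    // dotted-quad form. The class and method names are illustrative, not a
    // platform API.
    public class RosterEndpointDecoder {

        static String toDottedQuad(String base64Ip) {
            byte[] octets = Base64.getDecoder().decode(base64Ip); // 4 raw bytes
            if (octets.length != 4) {
                throw new IllegalArgumentException("expected an IPv4 address, got " + octets.length + " bytes");
            }
            return (octets[0] & 0xFF) + "." + (octets[1] & 0xFF) + "."
                    + (octets[2] & 0xFF) + "." + (octets[3] & 0xFF);
        }

        public static void main(String[] args) {
            // Values copied from the first roster entry above.
            System.out.println(toDottedQuad("Ijy81A==") + ":30124"); // 34.60.188.212:30124
            System.out.println(toDottedQuad("CoAAIg==") + ":30124"); // 10.128.0.34:30124
        }
    }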
node4 5m 56.342s 2025-09-24 20:30:56.487 57 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 260
Timestamp: 2025-09-24T20:28:00.045236907Z
Next consensus number: 6832
Legacy running event hash: c81ffe55f19259314fd6ee1f96d133d05572d4acdb3b66d1e8e79af6abfba5f68efe2d3b084cf34bec5250edddf27ee3
Legacy running event mnemonic: wire-cube-soccer-change
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1845518224
Root hash: 352ea036ff0daac342e6efe71167c2593c5fc788cf491991714bbc630940f9ca628e919135ba038237256405d7ec1072
(root) ConsistencyTestingToolState / place-welcome-bunker-police
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 wool-mad-laptop-tip
  1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline
  2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
  3 StringLeaf 3744925396723063960 /3 pretty-motor-can-dash
  4 StringLeaf 259 /4 chronic-core-repair-exotic
node4 5m 56.569s 2025-09-24 20:30:56.714 59 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: c81ffe55f19259314fd6ee1f96d133d05572d4acdb3b66d1e8e79af6abfba5f68efe2d3b084cf34bec5250edddf27ee3
node4 5m 56.583s 2025-09-24 20:30:56.728 60 INFO STARTUP <platformForkJoinThread-4> Shadowgraph: Shadowgraph starting from expiration threshold 233
node4 5m 56.591s 2025-09-24 20:30:56.736 62 INFO STARTUP <<start-node-4>> ConsistencyTestingToolMain: init called in Main for node 4.
node4 5m 56.592s 2025-09-24 20:30:56.737 63 INFO STARTUP <<start-node-4>> SwirldsPlatform: Starting platform 4
node4 5m 56.594s 2025-09-24 20:30:56.739 64 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node4 5m 56.597s 2025-09-24 20:30:56.742 65 INFO STARTUP <<start-node-4>> CycleFinder: No cyclical back pressure detected in wiring model.
node4 5m 56.599s 2025-09-24 20:30:56.744 66 INFO STARTUP <<start-node-4>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node4 5m 56.599s 2025-09-24 20:30:56.744 67 INFO STARTUP <<start-node-4>> InputWireChecks: All input wires have been bound.
node4 5m 56.602s 2025-09-24 20:30:56.747 68 INFO STARTUP <<start-node-4>> SwirldsPlatform: replaying preconsensus event stream starting at 233
node4 5m 56.607s 2025-09-24 20:30:56.752 69 INFO PLATFORM_STATUS <platformForkJoinThread-1> DefaultStatusStateMachine: Platform spent 175.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node4 5m 56.820s 2025-09-24 20:30:56.965 70 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:3 H:62ac193fd3c0 BR:258), num remaining: 4
node4 5m 56.821s 2025-09-24 20:30:56.966 71 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:4 H:f04c31519bc8 BR:259), num remaining: 3
node4 5m 56.822s 2025-09-24 20:30:56.967 72 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:0 H:4b6b6d0ac732 BR:258), num remaining: 2
node4 5m 56.822s 2025-09-24 20:30:56.967 73 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:1 H:f3d2e80dbcc8 BR:259), num remaining: 1
node4 5m 56.823s 2025-09-24 20:30:56.968 74 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:2 H:f0ae343ba0a6 BR:259), num remaining: 0
node4 5m 56.917s 2025-09-24 20:30:57.062 121 INFO STARTUP <<start-node-4>> PcesReplayer: Replayed 1,165 preconsensus events with max birth round 276. These events contained 3,012 transactions. 15 rounds reached consensus spanning 9.9 seconds of consensus time. The latest round to reach consensus is round 275. Replay took 313.0 milliseconds.
node4 5m 56.920s 2025-09-24 20:30:57.065 123 INFO PLATFORM_STATUS <platformForkJoinThread-3> DefaultStatusStateMachine: Platform spent 309.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node4 5m 56.920s 2025-09-24 20:30:57.065 124 INFO STARTUP <<app: appMain 4>> ConsistencyTestingToolMain: run called in Main.
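The PcesReplayer summary above lends itself to a quick sanity check with simple arithmetic: 1,165 events and 3,012 transactions replayed in 313 ms, covering 15 rounds that span 9.9 seconds of consensus time. A tiny sketch of that calculation follows; the figures are copied from the log line, and the class name is mine.

    // Back-of-the-envelope figures derived from the PcesReplayer summary above.
    public class ReplayStats {
        public static void main(String[] args) {
            long events = 1_165;
            long transactions = 3_012;
            long rounds = 15;
            double replaySeconds = 0.313;       // "Replay took 313.0 milliseconds"
            double consensusSpanSeconds = 9.9;  // consensus time covered by the replayed rounds

            System.out.printf("events/sec during replay:      %.0f%n", events / replaySeconds);         // ~3722
            System.out.printf("transactions per round:        %.1f%n", (double) transactions / rounds); // ~200.8
            System.out.printf("replay speed-up vs. real time: %.0fx%n", consensusSpanSeconds / replaySeconds); // ~32x
        }
    }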
node4 5m 57.859s 2025-09-24 20:30:58.004 211 INFO PLATFORM_STATUS <platformForkJoinThread-3> DefaultStatusStateMachine: Platform spent 938.0 ms in OBSERVING. Now in BEHIND
node4 5m 57.860s 2025-09-24 20:30:58.005 212 INFO RECONNECT <platformForkJoinThread-8> ReconnectController: Starting ReconnectController
node4 5m 57.861s 2025-09-24 20:30:58.006 213 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectPlatformHelperImpl: Preparing for reconnect, stopping gossip
node4 5m 57.861s 2025-09-24 20:30:58.006 214 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectPlatformHelperImpl: Preparing for reconnect, start clearing queues
node4 5m 57.862s 2025-09-24 20:30:58.007 215 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectPlatformHelperImpl: Queues have been cleared
node4 5m 57.864s 2025-09-24 20:30:58.009 216 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectController: waiting for reconnect connection
node4 5m 57.864s 2025-09-24 20:30:58.009 217 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectController: acquired reconnect connection
node3 5m 58.099s 2025-09-24 20:30:58.244 6164 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> ReconnectTeacher: Starting reconnect in the role of the sender {"receiving":false,"nodeId":3,"otherNodeId":4,"round":514} [com.swirlds.logging.legacy.payload.ReconnectStartPayload]
node3 5m 58.100s 2025-09-24 20:30:58.245 6165 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> ReconnectTeacher: The following state will be sent to the learner:
Round: 514
Timestamp: 2025-09-24T20:30:55.695126Z
Next consensus number: 11484
Legacy running event hash: 776fec74f636b83a88247c5c74c6ee078ecf2a1dc3c0bc0bd70909f18390f35152d5795e8549d1dc1f0908e685aeb415
Legacy running event mnemonic: unknown-immense-deer-web
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1923318400
Root hash: a945f85d3622f6fb35af94aefef265aa6b627ac2d9838d51bf858e85a15895b8f898215182ba470e20517b9b1ff184af
(root) ConsistencyTestingToolState / student-virtual-session-camera
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 silk-major-crew-hybrid
  1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline
  2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
  3 StringLeaf -6381862706355181899 /3 clown-eagle-pyramid-whisper
  4 StringLeaf 513 /4 logic-piece-satoshi-census
node3 5m 58.100s 2025-09-24 20:30:58.245 6166 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> ReconnectTeacher: Sending signatures from nodes 0, 1, 3 (signing weight = 37500000000/50000000000) for state hash a945f85d3622f6fb35af94aefef265aa6b627ac2d9838d51bf858e85a15895b8f898215182ba470e20517b9b1ff184af
node3 5m 58.100s 2025-09-24 20:30:58.245 6167 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> ReconnectTeacher: Starting synchronization in the role of the sender.
node3 5m 58.104s 2025-09-24 20:30:58.249 6168 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> TeachingSynchronizer: sending tree rooted at com.swirlds.demo.consistency.ConsistencyTestingToolState with route []
node3 5m 58.112s 2025-09-24 20:30:58.257 6169 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@4c50ec38 start run()
node4 5m 58.164s 2025-09-24 20:30:58.309 218 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectSyncHelper: Starting reconnect in role of the receiver. {"receiving":true,"nodeId":4,"otherNodeId":3,"round":274} [com.swirlds.logging.legacy.payload.ReconnectStartPayload]
node4 5m 58.166s 2025-09-24 20:30:58.311 219 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectLearner: Receiving signed state signatures
node4 5m 58.171s 2025-09-24 20:30:58.316 220 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectLearner: Received signatures from nodes 0, 1, 3
node4 5m 58.174s 2025-09-24 20:30:58.319 221 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls receiveTree()
node4 5m 58.174s 2025-09-24 20:30:58.319 222 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: synchronizing tree
node4 5m 58.175s 2025-09-24 20:30:58.320 223 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: receiving tree rooted at com.swirlds.demo.consistency.ConsistencyTestingToolState with route []
node4 5m 58.180s 2025-09-24 20:30:58.325 224 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@51bed345 start run()
node4 5m 58.183s 2025-09-24 20:30:58.328 225 INFO STARTUP <<work group learning-synchronizer: async-input-stream #0>> ConsistencyTestingToolState: New State Constructed.
node3 5m 58.265s 2025-09-24 20:30:58.410 6188 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@4c50ec38 finish run()
node3 5m 58.266s 2025-09-24 20:30:58.411 6189 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> TeachingSynchronizer: finished sending tree
node3 5m 58.266s 2025-09-24 20:30:58.411 6190 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> TeachingSynchronizer: sending tree rooted at com.swirlds.virtualmap.VirtualMap with route [2]
node3 5m 58.267s 2025-09-24 20:30:58.412 6191 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@150dba43 start run()
node4 5m 58.372s 2025-09-24 20:30:58.517 249 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread finished the learning loop for the current subtree
node4 5m 58.372s 2025-09-24 20:30:58.517 250 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread closed input, output, and view for the current subtree
node4 5m 58.373s 2025-09-24 20:30:58.518 251 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@51bed345 finish run()
node4 5m 58.374s 2025-09-24 20:30:58.519 252 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: received tree rooted at com.swirlds.demo.consistency.ConsistencyTestingToolState with route []
node4 5m 58.374s 2025-09-24 20:30:58.519 253 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: receiving tree rooted at com.swirlds.virtualmap.VirtualMap with route [2]
node4 5m 58.377s 2025-09-24 20:30:58.522 254 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@52cf5ccf start run()
node4 5m 58.444s 2025-09-24 20:30:58.589 255 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> ReconnectNodeRemover: setPathInformation(): firstLeafPath: 1 -> 1, lastLeafPath: 1 -> 1
node4 5m 58.445s 2025-09-24 20:30:58.590 256 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> ReconnectNodeRemover: setPathInformation(): done
node4 5m 58.447s 2025-09-24 20:30:58.592 257 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread finished the learning loop for the current subtree
node4 5m 58.448s 2025-09-24 20:30:58.593 258 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: call nodeRemover.allNodesReceived()
node4 5m 58.448s 2025-09-24 20:30:58.593 259 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> ReconnectNodeRemover: allNodesReceived()
node4 5m 58.448s 2025-09-24 20:30:58.593 260 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> ReconnectNodeRemover: allNodesReceived(): done
node4 5m 58.449s 2025-09-24 20:30:58.594 261 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: call root.endLearnerReconnect()
node4 5m 58.449s 2025-09-24 20:30:58.594 262 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call reconnectIterator.close()
node4 5m 58.449s 2025-09-24 20:30:58.594 263 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call setHashPrivate()
node3 5m 58.517s 2025-09-24 20:30:58.662 6192 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@150dba43 finish run()
node3 5m 58.518s 2025-09-24 20:30:58.663 6193 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> TeachingSynchronizer: finished sending tree
node3 5m 58.521s 2025-09-24 20:30:58.666 6196 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> ReconnectTeacher: Finished synchronization in the role of the sender.
node4 5m 58.607s 2025-09-24 20:30:58.752 273 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call postInit()
node4 5m 58.608s 2025-09-24 20:30:58.753 275 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: endLearnerReconnect() complete
node4 5m 58.608s 2025-09-24 20:30:58.753 276 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: close() complete
node4 5m 58.608s 2025-09-24 20:30:58.753 277 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread closed input, output, and view for the current subtree
node4 5m 58.609s 2025-09-24 20:30:58.754 278 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@52cf5ccf finish run()
node4 5m 58.610s 2025-09-24 20:30:58.755 279 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: received tree rooted at com.swirlds.virtualmap.VirtualMap with route [2]
node4 5m 58.610s 2025-09-24 20:30:58.755 280 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: synchronization complete
node4 5m 58.610s 2025-09-24 20:30:58.755 281 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls initialize()
node4 5m 58.611s 2025-09-24 20:30:58.756 282 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: initializing tree
node4 5m 58.611s 2025-09-24 20:30:58.756 283 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: initialization complete
node4 5m 58.611s 2025-09-24 20:30:58.756 284 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls hash()
node4 5m 58.611s 2025-09-24 20:30:58.756 285 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: hashing tree
node4 5m 58.612s 2025-09-24 20:30:58.757 286 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: hashing complete
node4 5m 58.612s 2025-09-24 20:30:58.757 287 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls logStatistics()
node4 5m 58.615s 2025-09-24 20:30:58.760 288 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: Finished synchronization {"timeInSeconds":0.435,"hashTimeInSeconds":0.0,"initializationTimeInSeconds":0.0,"totalNodes":12,"leafNodes":7,"redundantLeafNodes":4,"internalNodes":5,"redundantInternalNodes":2} [com.swirlds.logging.legacy.payload.SynchronizationCompletePayload]
node4 5m 58.616s 2025-09-24 20:30:58.761 289 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: ReconnectMapMetrics: transfersFromTeacher=12; transfersFromLearner=10; internalHashes=0; internalCleanHashes=0; internalData=0; internalCleanData=0; leafHashes=3; leafCleanHashes=3; leafData=7; leafCleanData=4
node4 5m 58.616s 2025-09-24 20:30:58.761 290 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner is done synchronizing
node4 5m 58.619s 2025-09-24 20:30:58.764 291 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectLearner: Reconnect data usage report {"dataMegabytes":0.0060558319091796875} [com.swirlds.logging.legacy.payload.ReconnectDataUsagePayload]
node4 5m 58.622s 2025-09-24 20:30:58.767 292 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectSyncHelper: Finished reconnect in the role of the receiver. {"receiving":true,"nodeId":4,"otherNodeId":3,"round":514,"success":false} [com.swirlds.logging.legacy.payload.ReconnectFinishPayload]
node4 5m 58.623s 2025-09-24 20:30:58.768 293 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectSyncHelper: Information for state received during reconnect:
Round: 514
Timestamp: 2025-09-24T20:30:55.695126Z
Next consensus number: 11484
Legacy running event hash: 776fec74f636b83a88247c5c74c6ee078ecf2a1dc3c0bc0bd70909f18390f35152d5795e8549d1dc1f0908e685aeb415
Legacy running event mnemonic: unknown-immense-deer-web
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1923318400
Root hash: a945f85d3622f6fb35af94aefef265aa6b627ac2d9838d51bf858e85a15895b8f898215182ba470e20517b9b1ff184af
(root) ConsistencyTestingToolState / student-virtual-session-camera
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 silk-major-crew-hybrid
  1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline
  2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
  3 StringLeaf -6381862706355181899 /3 clown-eagle-pyramid-whisper
  4 StringLeaf 513 /4 logic-piece-satoshi-census
node4 5m 58.624s 2025-09-24 20:30:58.769 295 DEBUG RECONNECT <<reconnect: reconnect-controller>> ReconnectStateLoader: `loadReconnectState` : reloading state
node4 5m 58.624s 2025-09-24 20:30:58.769 296 INFO STARTUP <<reconnect: reconnect-controller>> ConsistencyTestingToolState: State initialized with state long -6381862706355181899.
node4 5m 58.624s 2025-09-24 20:30:58.769 297 INFO STARTUP <<reconnect: reconnect-controller>> ConsistencyTestingToolState: State initialized with 513 rounds handled.
node4 5m 58.624s 2025-09-24 20:30:58.769 298 INFO STARTUP <<reconnect: reconnect-controller>> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 5m 58.625s 2025-09-24 20:30:58.770 299 INFO STARTUP <<reconnect: reconnect-controller>> TransactionHandlingHistory: Log file found. Parsing previous history
node4 5m 58.645s 2025-09-24 20:30:58.790 306 INFO STATE_TO_DISK <<reconnect: reconnect-controller>> DefaultSavedStateController: Signed state from round 514 created, will eventually be written to disk, for reason: RECONNECT
node4 5m 58.645s 2025-09-24 20:30:58.790 307 INFO PLATFORM_STATUS <platformForkJoinThread-3> DefaultStatusStateMachine: Platform spent 785.0 ms in BEHIND. Now in RECONNECT_COMPLETE
node4 5m 58.646s 2025-09-24 20:30:58.791 309 INFO STARTUP <platformForkJoinThread-7> Shadowgraph: Shadowgraph starting from expiration threshold 487
node4 5m 58.648s 2025-09-24 20:30:58.793 311 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 514 state to disk. Reason: RECONNECT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/514
node4 5m 58.649s 2025-09-24 20:30:58.794 312 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/2 for round 514
node4 5m 58.652s 2025-09-24 20:30:58.797 313 INFO EVENT_STREAM <<reconnect: reconnect-controller>> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 776fec74f636b83a88247c5c74c6ee078ecf2a1dc3c0bc0bd70909f18390f35152d5795e8549d1dc1f0908e685aeb415
node4 5m 58.653s 2025-09-24 20:30:58.798 315 INFO STARTUP <platformForkJoinThread-7> PcesFileManager: Due to recent operations on this node, the local preconsensus event stream will have a discontinuity. The last file with the old origin round is 2025-09-24T20+25+16.448491026Z_seq0_minr1_maxr276_orgn0.pces. All future files will have an origin round of 514.
node3 5m 58.692s 2025-09-24 20:30:58.837 6200 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> ReconnectTeacher: Finished reconnect in the role of the sender. {"receiving":false,"nodeId":3,"otherNodeId":4,"round":514,"success":false} [com.swirlds.logging.legacy.payload.ReconnectFinishPayload]
node4 5m 58.793s 2025-09-24 20:30:58.938 346 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/2 for round 514
node4 5m 58.796s 2025-09-24 20:30:58.941 347 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 514
Timestamp: 2025-09-24T20:30:55.695126Z
Next consensus number: 11484
Legacy running event hash: 776fec74f636b83a88247c5c74c6ee078ecf2a1dc3c0bc0bd70909f18390f35152d5795e8549d1dc1f0908e685aeb415
Legacy running event mnemonic: unknown-immense-deer-web
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1923318400
Root hash: a945f85d3622f6fb35af94aefef265aa6b627ac2d9838d51bf858e85a15895b8f898215182ba470e20517b9b1ff184af
(root) ConsistencyTestingToolState / student-virtual-session-camera
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 silk-major-crew-hybrid
    1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline
    2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
    3 StringLeaf -6381862706355181899 /3 clown-eagle-pyramid-whisper
    4 StringLeaf 513 /4 logic-piece-satoshi-census
node4 5m 58.839s 2025-09-24 20:30:58.984 359 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/09/24/2025-09-24T20+25+16.448491026Z_seq0_minr1_maxr276_orgn0.pces
node4 5m 58.840s 2025-09-24 20:30:58.985 360 WARN STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: No preconsensus event files meeting specified criteria found to copy. Lower bound: 487
node4 5m 58.848s 2025-09-24 20:30:58.993 361 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 514 to disk. Reason: RECONNECT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/514 {"round":514,"freezeState":false,"reason":"RECONNECT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/514/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 5m 58.851s 2025-09-24 20:30:58.996 362 INFO PLATFORM_STATUS <platformForkJoinThread-5> DefaultStatusStateMachine: Platform spent 205.0 ms in RECONNECT_COMPLETE. Now in CHECKING
node4 5m 59.515s 2025-09-24 20:30:59.660 363 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:1 H:c9e00d7e3be8 BR:512), num remaining: 3
node4 5m 59.517s 2025-09-24 20:30:59.662 364 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:0 H:1a0476d53ac3 BR:512), num remaining: 2
node4 5m 59.517s 2025-09-24 20:30:59.662 365 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:2 H:96bc1c618312 BR:512), num remaining: 1
node4 5m 59.520s 2025-09-24 20:30:59.665 366 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:3 H:957aee81bda7 BR:512), num remaining: 0
node4 5m 59.603s 2025-09-24 20:30:59.748 389 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting4.csv' ]
node4 5m 59.606s 2025-09-24 20:30:59.751 390 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node2 6m 2.410s 2025-09-24 20:31:02.555 6184 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 521 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 6m 2.457s 2025-09-24 20:31:02.602 6188 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 521 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 6m 2.525s 2025-09-24 20:31:02.670 6259 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 521 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 6m 2.541s 2025-09-24 20:31:02.686 430 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 521 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 6m 2.689s 2025-09-24 20:31:02.834 6214 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 521 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 6m 2.754s 2025-09-24 20:31:02.899 6217 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 521 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/521
node0 6m 2.755s 2025-09-24 20:31:02.900 6218 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 521
node0 6m 2.841s 2025-09-24 20:31:02.986 6253 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 521
node0 6m 2.843s 2025-09-24 20:31:02.988 6254 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 521
Timestamp: 2025-09-24T20:31:00.692633Z
Next consensus number: 11612
Legacy running event hash: 7d9c4a49e52335d46d0d4fd5fee9fb317ef6fa6db3ec3a5dc4bdb3fa77b1ef3bc1f28980bcc6236c34846e173c3a5447
Legacy running event mnemonic: receive-empty-forum-ritual
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1039521
Root hash: 9affddfba01fa95f26be9e761f6e79546813c3283c50ffe6aec07c584b3c2a48cdcf977068668443737f250ea7af18d8
(root) ConsistencyTestingToolState / parent-doll-wire-stage
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 baby-trap-oppose-modify
    1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline
    2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
    3 StringLeaf 8052865799524397596 /3 expand-way-twenty-grid
    4 StringLeaf 520 /4 lonely-clutch-crop-congress
node0 6m 2.849s 2025-09-24 20:31:02.994 6255 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/09/24/2025-09-24T20+25+16.311647372Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/0/2025/09/24/2025-09-24T20+30+48.108361244Z_seq1_minr474_maxr5474_orgn0.pces
node0 6m 2.849s 2025-09-24 20:31:02.994 6256 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus event files meeting specified criteria to copy.
Lower bound: 494
First file to copy: data/saved/preconsensus-events/0/2025/09/24/2025-09-24T20+25+16.311647372Z_seq0_minr1_maxr501_orgn0.pces
Last file to copy: data/saved/preconsensus-events/0/2025/09/24/2025-09-24T20+30+48.108361244Z_seq1_minr474_maxr5474_orgn0.pces
node0 6m 2.849s 2025-09-24 20:31:02.994 6257 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 2 preconsensus event file(s)
node0 6m 2.857s 2025-09-24 20:31:03.002 6258 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 2 preconsensus event file(s)
node0 6m 2.858s 2025-09-24 20:31:03.003 6259 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 521 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/521 {"round":521,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/521/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 6m 2.859s 2025-09-24 20:31:03.004 6260 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/74
node4 6m 2.861s 2025-09-24 20:31:03.006 432 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 521 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/521
node4 6m 2.862s 2025-09-24 20:31:03.007 433 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 521
node2 6m 2.939s 2025-09-24 20:31:03.084 6190 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 521 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/521
node2 6m 2.940s 2025-09-24 20:31:03.085 6191 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 521
node3 6m 2.940s 2025-09-24 20:31:03.085 6265 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 521 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/521
node3 6m 2.942s 2025-09-24 20:31:03.087 6266 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/38 for round 521
node4 6m 2.963s 2025-09-24 20:31:03.108 467 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 521
node4 6m 2.965s 2025-09-24 20:31:03.110 468 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 521
Timestamp: 2025-09-24T20:31:00.692633Z
Next consensus number: 11612
Legacy running event hash: 7d9c4a49e52335d46d0d4fd5fee9fb317ef6fa6db3ec3a5dc4bdb3fa77b1ef3bc1f28980bcc6236c34846e173c3a5447
Legacy running event mnemonic: receive-empty-forum-ritual
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1039521
Root hash: 9affddfba01fa95f26be9e761f6e79546813c3283c50ffe6aec07c584b3c2a48cdcf977068668443737f250ea7af18d8
(root) ConsistencyTestingToolState / parent-doll-wire-stage
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 baby-trap-oppose-modify
    1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline
    2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
    3 StringLeaf 8052865799524397596 /3 expand-way-twenty-grid
    4 StringLeaf 520 /4 lonely-clutch-crop-congress
node4 6m 2.972s 2025-09-24 20:31:03.117 469 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/4/2025/09/24/2025-09-24T20+30+59.094719486Z_seq1_minr487_maxr987_orgn514.pces
Last file: data/saved/preconsensus-events/4/2025/09/24/2025-09-24T20+25+16.448491026Z_seq0_minr1_maxr276_orgn0.pces
node4 6m 2.972s 2025-09-24 20:31:03.117 470 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 494
File: data/saved/preconsensus-events/4/2025/09/24/2025-09-24T20+30+59.094719486Z_seq1_minr487_maxr987_orgn514.pces
node4 6m 2.972s 2025-09-24 20:31:03.117 471 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 6m 2.974s 2025-09-24 20:31:03.119 472 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 6m 2.974s 2025-09-24 20:31:03.119 473 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 521 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/521 {"round":521,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/521/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 6m 2.976s 2025-09-24 20:31:03.121 474 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/2
node1 6m 3.010s 2025-09-24 20:31:03.155 6194 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 521 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/521
node1 6m 3.011s 2025-09-24 20:31:03.156 6195 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 521
node3 6m 3.024s 2025-09-24 20:31:03.169 6307 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/38 for round 521
node3 6m 3.026s 2025-09-24 20:31:03.171 6308 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 521
Timestamp: 2025-09-24T20:31:00.692633Z
Next consensus number: 11612
Legacy running event hash: 7d9c4a49e52335d46d0d4fd5fee9fb317ef6fa6db3ec3a5dc4bdb3fa77b1ef3bc1f28980bcc6236c34846e173c3a5447
Legacy running event mnemonic: receive-empty-forum-ritual
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1039521
Root hash: 9affddfba01fa95f26be9e761f6e79546813c3283c50ffe6aec07c584b3c2a48cdcf977068668443737f250ea7af18d8
(root) ConsistencyTestingToolState / parent-doll-wire-stage
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 baby-trap-oppose-modify
    1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline
    2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
    3 StringLeaf 8052865799524397596 /3 expand-way-twenty-grid
    4 StringLeaf 520 /4 lonely-clutch-crop-congress
node2 6m 3.032s 2025-09-24 20:31:03.177 6231 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 521
node3 6m 3.032s 2025-09-24 20:31:03.177 6309 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/09/24/2025-09-24T20+25+16.374322506Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/3/2025/09/24/2025-09-24T20+30+48.077293515Z_seq1_minr474_maxr5474_orgn0.pces
node3 6m 3.032s 2025-09-24 20:31:03.177 6310 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus event files meeting specified criteria to copy.
Lower bound: 494
First file to copy: data/saved/preconsensus-events/3/2025/09/24/2025-09-24T20+25+16.374322506Z_seq0_minr1_maxr501_orgn0.pces
Last file to copy: data/saved/preconsensus-events/3/2025/09/24/2025-09-24T20+30+48.077293515Z_seq1_minr474_maxr5474_orgn0.pces
node3 6m 3.032s 2025-09-24 20:31:03.177 6311 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 2 preconsensus event file(s)
node2 6m 3.034s 2025-09-24 20:31:03.179 6232 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 521
Timestamp: 2025-09-24T20:31:00.692633Z
Next consensus number: 11612
Legacy running event hash: 7d9c4a49e52335d46d0d4fd5fee9fb317ef6fa6db3ec3a5dc4bdb3fa77b1ef3bc1f28980bcc6236c34846e173c3a5447
Legacy running event mnemonic: receive-empty-forum-ritual
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1039521
Root hash: 9affddfba01fa95f26be9e761f6e79546813c3283c50ffe6aec07c584b3c2a48cdcf977068668443737f250ea7af18d8
(root) ConsistencyTestingToolState / parent-doll-wire-stage
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 baby-trap-oppose-modify
    1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline
    2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
    3 StringLeaf 8052865799524397596 /3 expand-way-twenty-grid
    4 StringLeaf 520 /4 lonely-clutch-crop-congress
node3 6m 3.041s 2025-09-24 20:31:03.186 6312 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 2 preconsensus event file(s)
node3 6m 3.041s 2025-09-24 20:31:03.186 6313 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 521 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/521 {"round":521,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/521/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 6m 3.042s 2025-09-24 20:31:03.187 6233 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/09/24/2025-09-24T20+30+48.132100330Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/2/2025/09/24/2025-09-24T20+25+16.619264354Z_seq0_minr1_maxr501_orgn0.pces
node2 6m 3.043s 2025-09-24 20:31:03.188 6234 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus event files meeting specified criteria to copy.
Lower bound: 494
First file to copy: data/saved/preconsensus-events/2/2025/09/24/2025-09-24T20+25+16.619264354Z_seq0_minr1_maxr501_orgn0.pces
Last file to copy: data/saved/preconsensus-events/2/2025/09/24/2025-09-24T20+30+48.132100330Z_seq1_minr474_maxr5474_orgn0.pces
node2 6m 3.043s 2025-09-24 20:31:03.188 6235 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 2 preconsensus event file(s)
node3 6m 3.043s 2025-09-24 20:31:03.188 6314 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/74
node2 6m 3.052s 2025-09-24 20:31:03.197 6236 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 2 preconsensus event file(s)
node2 6m 3.052s 2025-09-24 20:31:03.197 6237 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 521 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/521 {"round":521,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/521/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 6m 3.054s 2025-09-24 20:31:03.199 6238 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/74
node4 6m 3.064s 2025-09-24 20:31:03.209 484 INFO PLATFORM_STATUS <platformForkJoinThread-2> DefaultStatusStateMachine: Platform spent 4.2 s in CHECKING. Now in ACTIVE
node1 6m 3.089s 2025-09-24 20:31:03.234 6236 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 521
node1 6m 3.091s 2025-09-24 20:31:03.236 6237 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 521
Timestamp: 2025-09-24T20:31:00.692633Z
Next consensus number: 11612
Legacy running event hash: 7d9c4a49e52335d46d0d4fd5fee9fb317ef6fa6db3ec3a5dc4bdb3fa77b1ef3bc1f28980bcc6236c34846e173c3a5447
Legacy running event mnemonic: receive-empty-forum-ritual
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1039521
Root hash: 9affddfba01fa95f26be9e761f6e79546813c3283c50ffe6aec07c584b3c2a48cdcf977068668443737f250ea7af18d8
(root) ConsistencyTestingToolState / parent-doll-wire-stage
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 baby-trap-oppose-modify
    1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline
    2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
    3 StringLeaf 8052865799524397596 /3 expand-way-twenty-grid
    4 StringLeaf 520 /4 lonely-clutch-crop-congress
node1 6m 3.096s 2025-09-24 20:31:03.241 6238 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/09/24/2025-09-24T20+25+16.237543055Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/1/2025/09/24/2025-09-24T20+30+48.222175017Z_seq1_minr474_maxr5474_orgn0.pces
node1 6m 3.097s 2025-09-24 20:31:03.242 6239 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus event files meeting specified criteria to copy.
Lower bound: 494
First file to copy: data/saved/preconsensus-events/1/2025/09/24/2025-09-24T20+25+16.237543055Z_seq0_minr1_maxr501_orgn0.pces
Last file to copy: data/saved/preconsensus-events/1/2025/09/24/2025-09-24T20+30+48.222175017Z_seq1_minr474_maxr5474_orgn0.pces
node1 6m 3.097s 2025-09-24 20:31:03.242 6240 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 2 preconsensus event file(s)
node1 6m 3.105s 2025-09-24 20:31:03.250 6241 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 2 preconsensus event file(s)
node1 6m 3.106s 2025-09-24 20:31:03.251 6242 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 521 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/521 {"round":521,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/521/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 6m 3.107s 2025-09-24 20:31:03.252 6243 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/74
node2 7m 1.880s 2025-09-24 20:32:02.025 7320 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 620 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 7m 2.038s 2025-09-24 20:32:02.183 1563 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 620 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 7m 2.167s 2025-09-24 20:32:02.312 7314 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 620 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 7m 2.205s 2025-09-24 20:32:02.350 7368 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 620 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 7m 2.224s 2025-09-24 20:32:02.369 7407 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 620 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 7m 2.616s 2025-09-24 20:32:02.761 7413 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 620 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/620
node3 7m 2.618s 2025-09-24 20:32:02.763 7414 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/44 for round 620
node1 7m 2.627s 2025-09-24 20:32:02.772 7320 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 620 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/620
node1 7m 2.627s 2025-09-24 20:32:02.772 7321 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 620
node0 7m 2.686s 2025-09-24 20:32:02.831 7374 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 620 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/620
node0 7m 2.687s 2025-09-24 20:32:02.832 7375 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 620
node2 7m 2.697s 2025-09-24 20:32:02.842 7326 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 620 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/620
node2 7m 2.698s 2025-09-24 20:32:02.843 7327 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 620
node3 7m 2.705s 2025-09-24 20:32:02.850 7459 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/44 for round 620
node3 7m 2.707s 2025-09-24 20:32:02.852 7460 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 620
Timestamp: 2025-09-24T20:32:00.265223738Z
Next consensus number: 14088
Legacy running event hash: 7db65ba4f884a9ad212e7434af41db2aab42cfff00d224e6a354487d11b2510c02f025c07ac5d6dcb70de56f936a6c53
Legacy running event mnemonic: replace-innocent-usage-skill
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1397905226
Root hash: ad684d2663666bf7ab6c4d8fb700e7dbaf3b7fd69c99bc321040d281026d9c0727311d12419db15a8e297e2e316462ab
(root) ConsistencyTestingToolState / fantasy-pill-holiday-leave
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 foil-hen-recipe-ticket
    1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline
    2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
    3 StringLeaf 5735427662034108991 /3 brand-quarter-cart-faculty
    4 StringLeaf 619 /4 assist-old-dumb-food
node1 7m 2.711s 2025-09-24 20:32:02.856 7357 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 620
node1 7m 2.713s 2025-09-24 20:32:02.858 7358 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 620
Timestamp: 2025-09-24T20:32:00.265223738Z
Next consensus number: 14088
Legacy running event hash: 7db65ba4f884a9ad212e7434af41db2aab42cfff00d224e6a354487d11b2510c02f025c07ac5d6dcb70de56f936a6c53
Legacy running event mnemonic: replace-innocent-usage-skill
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1397905226
Root hash: ad684d2663666bf7ab6c4d8fb700e7dbaf3b7fd69c99bc321040d281026d9c0727311d12419db15a8e297e2e316462ab
(root) ConsistencyTestingToolState / fantasy-pill-holiday-leave
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 foil-hen-recipe-ticket
    1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline
    2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
    3 StringLeaf 5735427662034108991 /3 brand-quarter-cart-faculty
    4 StringLeaf 619 /4 assist-old-dumb-food
node3 7m 2.714s 2025-09-24 20:32:02.859 7461 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/09/24/2025-09-24T20+25+16.374322506Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/3/2025/09/24/2025-09-24T20+30+48.077293515Z_seq1_minr474_maxr5474_orgn0.pces
node3 7m 2.715s 2025-09-24 20:32:02.860 7462 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 593
File: data/saved/preconsensus-events/3/2025/09/24/2025-09-24T20+30+48.077293515Z_seq1_minr474_maxr5474_orgn0.pces
node3 7m 2.715s 2025-09-24 20:32:02.860 7463 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 7m 2.717s 2025-09-24 20:32:02.862 7464 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 7m 2.717s 2025-09-24 20:32:02.862 7465 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 620 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/620 {"round":620,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/620/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 7m 2.719s 2025-09-24 20:32:02.864 7466 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/167
node1 7m 2.723s 2025-09-24 20:32:02.868 7359 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/09/24/2025-09-24T20+25+16.237543055Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/1/2025/09/24/2025-09-24T20+30+48.222175017Z_seq1_minr474_maxr5474_orgn0.pces
node1 7m 2.723s 2025-09-24 20:32:02.868 7360 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 593
File: data/saved/preconsensus-events/1/2025/09/24/2025-09-24T20+30+48.222175017Z_seq1_minr474_maxr5474_orgn0.pces
node1 7m 2.723s 2025-09-24 20:32:02.868 7361 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 7m 2.725s 2025-09-24 20:32:02.870 7362 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 7m 2.726s 2025-09-24 20:32:02.871 7363 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 620 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/620 {"round":620,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/620/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 7m 2.727s 2025-09-24 20:32:02.872 7364 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/167
node0 7m 2.771s 2025-09-24 20:32:02.916 7416 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 620
node0 7m 2.773s 2025-09-24 20:32:02.918 7417 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 620
Timestamp: 2025-09-24T20:32:00.265223738Z
Next consensus number: 14088
Legacy running event hash: 7db65ba4f884a9ad212e7434af41db2aab42cfff00d224e6a354487d11b2510c02f025c07ac5d6dcb70de56f936a6c53
Legacy running event mnemonic: replace-innocent-usage-skill
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1397905226
Root hash: ad684d2663666bf7ab6c4d8fb700e7dbaf3b7fd69c99bc321040d281026d9c0727311d12419db15a8e297e2e316462ab
(root) ConsistencyTestingToolState / fantasy-pill-holiday-leave
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 foil-hen-recipe-ticket
    1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline
    2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
    3 StringLeaf 5735427662034108991 /3 brand-quarter-cart-faculty
    4 StringLeaf 619 /4 assist-old-dumb-food
node0 7m 2.783s 2025-09-24 20:32:02.928 7418 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/09/24/2025-09-24T20+25+16.311647372Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/0/2025/09/24/2025-09-24T20+30+48.108361244Z_seq1_minr474_maxr5474_orgn0.pces
node0 7m 2.783s 2025-09-24 20:32:02.928 7419 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 593
File: data/saved/preconsensus-events/0/2025/09/24/2025-09-24T20+30+48.108361244Z_seq1_minr474_maxr5474_orgn0.pces
node0 7m 2.783s 2025-09-24 20:32:02.928 7420 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 7m 2.785s 2025-09-24 20:32:02.930 7421 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 7m 2.786s 2025-09-24 20:32:02.931 7422 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 620 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/620 {"round":620,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/620/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 7m 2.787s 2025-09-24 20:32:02.932 7423 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/167
node2 7m 2.795s 2025-09-24 20:32:02.940 7368 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 620
node2 7m 2.798s 2025-09-24 20:32:02.943 7369 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 620
Timestamp: 2025-09-24T20:32:00.265223738Z
Next consensus number: 14088
Legacy running event hash: 7db65ba4f884a9ad212e7434af41db2aab42cfff00d224e6a354487d11b2510c02f025c07ac5d6dcb70de56f936a6c53
Legacy running event mnemonic: replace-innocent-usage-skill
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1397905226
Root hash: ad684d2663666bf7ab6c4d8fb700e7dbaf3b7fd69c99bc321040d281026d9c0727311d12419db15a8e297e2e316462ab
(root) ConsistencyTestingToolState / fantasy-pill-holiday-leave
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 foil-hen-recipe-ticket
    1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline
    2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
    3 StringLeaf 5735427662034108991 /3 brand-quarter-cart-faculty
    4 StringLeaf 619 /4 assist-old-dumb-food
node2 7m 2.808s 2025-09-24 20:32:02.953 7370 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/09/24/2025-09-24T20+30+48.132100330Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/2/2025/09/24/2025-09-24T20+25+16.619264354Z_seq0_minr1_maxr501_orgn0.pces
node2 7m 2.808s 2025-09-24 20:32:02.953 7371 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 593
File: data/saved/preconsensus-events/2/2025/09/24/2025-09-24T20+30+48.132100330Z_seq1_minr474_maxr5474_orgn0.pces
node2 7m 2.809s 2025-09-24 20:32:02.954 7372 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 7m 2.811s 2025-09-24 20:32:02.956 7373 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 7m 2.811s 2025-09-24 20:32:02.956 7374 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 620 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/620 {"round":620,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/620/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 7m 2.813s 2025-09-24 20:32:02.958 7375 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/167
node4 7m 2.873s 2025-09-24 20:32:03.018 1569 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 620 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/620
node4 7m 2.874s 2025-09-24 20:32:03.019 1570 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/14 for round 620
node4 7m 2.979s 2025-09-24 20:32:03.124 1614 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/14 for round 620
node4 7m 2.981s 2025-09-24 20:32:03.126 1615 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 620
Timestamp: 2025-09-24T20:32:00.265223738Z
Next consensus number: 14088
Legacy running event hash: 7db65ba4f884a9ad212e7434af41db2aab42cfff00d224e6a354487d11b2510c02f025c07ac5d6dcb70de56f936a6c53
Legacy running event mnemonic: replace-innocent-usage-skill
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1397905226
Root hash: ad684d2663666bf7ab6c4d8fb700e7dbaf3b7fd69c99bc321040d281026d9c0727311d12419db15a8e297e2e316462ab
(root) ConsistencyTestingToolState / fantasy-pill-holiday-leave
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 foil-hen-recipe-ticket
    1 SingletonNode RosterService.ROSTER_STATE /1 armed-pupil-delay-decline
    2 VirtualMap RosterService.ROSTERS /2 boss-dolphin-cube-pilot
    3 StringLeaf 5735427662034108991 /3 brand-quarter-cart-faculty
    4 StringLeaf 619 /4 assist-old-dumb-food
node4 7m 2.988s 2025-09-24 20:32:03.133 1616 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/4/2025/09/24/2025-09-24T20+30+59.094719486Z_seq1_minr487_maxr987_orgn514.pces
Last file: data/saved/preconsensus-events/4/2025/09/24/2025-09-24T20+25+16.448491026Z_seq0_minr1_maxr276_orgn0.pces
node4 7m 2.989s 2025-09-24 20:32:03.134 1617 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 593
File: data/saved/preconsensus-events/4/2025/09/24/2025-09-24T20+30+59.094719486Z_seq1_minr487_maxr987_orgn514.pces
node4 7m 2.989s 2025-09-24 20:32:03.134 1618 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 7m 2.991s 2025-09-24 20:32:03.136 1619 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 7m 2.992s 2025-09-24 20:32:03.137 1620 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 620 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/620 {"round":620,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/620/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 7m 2.993s 2025-09-24 20:32:03.138 1621 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/74
node2 7m 54.441s 2025-09-24 20:32:54.586 8248 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith1 2 to 1>> NetworkUtils: Connection broken: 2 <- 1
java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.FilterInputStream.read(FilterInputStream.java:71)
    at org.hiero.base.io.streams.AugmentedDataInputStream.read(AugmentedDataInputStream.java:57)
    at com.swirlds.platform.network.communication.states.WaitForAcceptReject.transition(WaitForAcceptReject.java:48)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
node3 7m 54.441s 2025-09-24 20:32:54.586 8341 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith1 3 to 1>> NetworkUtils: Connection broken: 3 <- 1
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-24T20:32:54.585949310Z
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-24T20:32:54.585949310Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:180)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
    ... 7 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
    at java.base/java.io.DataInputStream.readBoolean(DataInputStream.java:255)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readBoolean(AugmentedDataInputStream.java:137)
    at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readMyTipsTheyHave$7(SyncUtils.java:163)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
    ... 11 more
node0 7m 54.443s 2025-09-24 20:32:54.588 8294 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith1 0 to 1>> NetworkUtils: Connection broken: 0 -> 1
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-24T20:32:54.586943653Z
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-24T20:32:54.586943653Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:180)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
    ... 7 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
    at java.base/java.io.DataInputStream.readBoolean(DataInputStream.java:255)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readBoolean(AugmentedDataInputStream.java:137)
    at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readMyTipsTheyHave$7(SyncUtils.java:163)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
    ... 11 more
node2 7m 54.528s 2025-09-24 20:32:54.673 8249 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 2 to 4>> NetworkUtils: Connection broken: 2 -> 4
java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.FilterInputStream.read(FilterInputStream.java:71)
    at org.hiero.base.io.streams.AugmentedDataInputStream.read(AugmentedDataInputStream.java:57)
    at com.swirlds.platform.network.communication.states.WaitForAcceptReject.transition(WaitForAcceptReject.java:48)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
node0 7m 54.529s 2025-09-24 20:32:54.674 8295 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 0 to 4>> NetworkUtils: Connection broken: 0 -> 4
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-24T20:32:54.674286213Z
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-24T20:32:54.674286213Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:148)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
    ... 7 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readLong(DataInputStream.java:407)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readLong(AugmentedDataInputStream.java:186)
    at com.swirlds.platform.gossip.shadowgraph.SyncUtils.deserializeEventWindow(SyncUtils.java:640)
    at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readTheirTipsAndEventWindow$3(SyncUtils.java:104)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
    ... 11 more
node3 7m 54.529s 2025-09-24 20:32:54.674 8342 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 3 to 4>> NetworkUtils: Connection broken: 3 -> 4
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-24T20:32:54.674112718Z
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
	at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
	at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
	at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
	at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
	at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-24T20:32:54.674112718Z
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.sendAndReceiveEvents(ShadowgraphSynchronizer.java:241)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:201)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
	... 7 more
Caused by: java.net.SocketException: Connection reset
	at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
	at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
	at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
	at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
	at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
	at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
	at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
	at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
	at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
	at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
	at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
	at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
	at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
	at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
	at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
	at java.base/java.io.DataInputStream.readByte(DataInputStream.java:275)
	at org.hiero.base.io.streams.AugmentedDataInputStream.readByte(AugmentedDataInputStream.java:144)
	at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readEventsINeed$9(SyncUtils.java:278)
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
	... 12 more