node4 0.000ns 2025-09-26 05:44:45.193 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node2 23.000ms 2025-09-26 05:44:45.216 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node3 57.000ms 2025-09-26 05:44:45.250 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node4 85.000ms 2025-09-26 05:44:45.278 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node4 100.000ms 2025-09-26 05:44:45.293 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node2 109.000ms 2025-09-26 05:44:45.302 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node2 123.000ms 2025-09-26 05:44:45.316 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node3 141.000ms 2025-09-26 05:44:45.334 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node3 155.000ms 2025-09-26 05:44:45.348 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 211.000ms 2025-09-26 05:44:45.404 4 INFO STARTUP <main> Browser: The following nodes [4] are set to run locally
node4 218.000ms 2025-09-26 05:44:45.411 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node4 229.000ms 2025-09-26 05:44:45.422 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node2 236.000ms 2025-09-26 05:44:45.429 4 INFO STARTUP <main> Browser: The following nodes [2] are set to run locally
node2 242.000ms 2025-09-26 05:44:45.435 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node2 254.000ms 2025-09-26 05:44:45.447 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node3 267.000ms 2025-09-26 05:44:45.460 4 INFO STARTUP <main> Browser: The following nodes [3] are set to run locally
node3 274.000ms 2025-09-26 05:44:45.467 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node3 286.000ms 2025-09-26 05:44:45.479 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node1 415.000ms 2025-09-26 05:44:45.608 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node1 514.000ms 2025-09-26 05:44:45.707 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node1 532.000ms 2025-09-26 05:44:45.725 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node0 556.000ms 2025-09-26 05:44:45.749 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node4 638.000ms 2025-09-26 05:44:45.831 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node4 639.000ms 2025-09-26 05:44:45.832 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node0 651.000ms 2025-09-26 05:44:45.844 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node1 661.000ms 2025-09-26 05:44:45.854 4 INFO STARTUP <main> Browser: The following nodes [1] are set to run locally
node0 667.000ms 2025-09-26 05:44:45.860 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node1 670.000ms 2025-09-26 05:44:45.863 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node2 679.000ms 2025-09-26 05:44:45.872 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node2 680.000ms 2025-09-26 05:44:45.873 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node1 686.000ms 2025-09-26 05:44:45.879 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node3 689.000ms 2025-09-26 05:44:45.882 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node3 690.000ms 2025-09-26 05:44:45.883 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node0 785.000ms 2025-09-26 05:44:45.978 4 INFO STARTUP <main> Browser: The following nodes [0] are set to run locally
node0 792.000ms 2025-09-26 05:44:45.985 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node0 805.000ms 2025-09-26 05:44:45.998 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node1 1.164s 2025-09-26 05:44:46.357 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node1 1.166s 2025-09-26 05:44:46.359 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node0 1.245s 2025-09-26 05:44:46.438 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node0 1.246s 2025-09-26 05:44:46.439 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node4 1.410s 2025-09-26 05:44:46.603 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 771ms
node4 1.417s 2025-09-26 05:44:46.610 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node4 1.420s 2025-09-26 05:44:46.613 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 1.452s 2025-09-26 05:44:46.645 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node4 1.510s 2025-09-26 05:44:46.703 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node4 1.511s 2025-09-26 05:44:46.704 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node2 1.529s 2025-09-26 05:44:46.722 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 849ms
node2 1.537s 2025-09-26 05:44:46.730 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node2 1.540s 2025-09-26 05:44:46.733 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node2 1.576s 2025-09-26 05:44:46.769 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node3 1.607s 2025-09-26 05:44:46.800 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 916ms
node3 1.616s 2025-09-26 05:44:46.809 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node3 1.619s 2025-09-26 05:44:46.812 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node2 1.640s 2025-09-26 05:44:46.833 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node2 1.641s 2025-09-26 05:44:46.834 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node3 1.672s 2025-09-26 05:44:46.865 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node3 1.731s 2025-09-26 05:44:46.924 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node3 1.732s 2025-09-26 05:44:46.925 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node1 2.264s 2025-09-26 05:44:47.457 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1098ms
node1 2.271s 2025-09-26 05:44:47.464 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node1 2.274s 2025-09-26 05:44:47.467 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node1 2.326s 2025-09-26 05:44:47.519 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node0 2.363s 2025-09-26 05:44:47.556 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1116ms
node0 2.371s 2025-09-26 05:44:47.564 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node0 2.374s 2025-09-26 05:44:47.567 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node1 2.391s 2025-09-26 05:44:47.584 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node1 2.392s 2025-09-26 05:44:47.585 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node0 2.413s 2025-09-26 05:44:47.606 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node0 2.485s 2025-09-26 05:44:47.678 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node0 2.486s 2025-09-26 05:44:47.679 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node4 3.575s 2025-09-26 05:44:48.768 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node2 3.631s 2025-09-26 05:44:48.824 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node4 3.654s 2025-09-26 05:44:48.847 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 3.656s 2025-09-26 05:44:48.849 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node4 3.657s 2025-09-26 05:44:48.850 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node2 3.712s 2025-09-26 05:44:48.905 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 3.715s 2025-09-26 05:44:48.908 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node2 3.715s 2025-09-26 05:44:48.908 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node3 3.785s 2025-09-26 05:44:48.978 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node3 3.871s 2025-09-26 05:44:49.064 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 3.873s 2025-09-26 05:44:49.066 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node3 3.874s 2025-09-26 05:44:49.067 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 4.436s 2025-09-26 05:44:49.629 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 4.439s 2025-09-26 05:44:49.632 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node4 4.446s 2025-09-26 05:44:49.639 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node2 4.456s 2025-09-26 05:44:49.649 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 4.456s 2025-09-26 05:44:49.649 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 4.458s 2025-09-26 05:44:49.651 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 4.459s 2025-09-26 05:44:49.652 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node2 4.465s 2025-09-26 05:44:49.658 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node2 4.476s 2025-09-26 05:44:49.669 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 4.477s 2025-09-26 05:44:49.670 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 4.507s 2025-09-26 05:44:49.700 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node1 4.544s 2025-09-26 05:44:49.737 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node0 4.602s 2025-09-26 05:44:49.795 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 4.605s 2025-09-26 05:44:49.798 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node0 4.606s 2025-09-26 05:44:49.799 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node1 4.627s 2025-09-26 05:44:49.820 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 4.630s 2025-09-26 05:44:49.823 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node1 4.631s 2025-09-26 05:44:49.824 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node3 4.657s 2025-09-26 05:44:49.850 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 4.661s 2025-09-26 05:44:49.854 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node3 4.668s 2025-09-26 05:44:49.861 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node3 4.681s 2025-09-26 05:44:49.874 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 4.683s 2025-09-26 05:44:49.876 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 5.440s 2025-09-26 05:44:50.633 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 5.444s 2025-09-26 05:44:50.637 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node0 5.452s 2025-09-26 05:44:50.645 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node1 5.456s 2025-09-26 05:44:50.649 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 5.460s 2025-09-26 05:44:50.653 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node0 5.466s 2025-09-26 05:44:50.659 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 5.466s 2025-09-26 05:44:50.659 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node0 5.469s 2025-09-26 05:44:50.662 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 5.476s 2025-09-26 05:44:50.669 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 5.478s 2025-09-26 05:44:50.671 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 5.578s 2025-09-26 05:44:50.771 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26231021] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=186010, randomLong=-4759948804046574928, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=10140, randomLong=4071747361987332206, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=1105250, data=35, exception=null] OS Health Check Report - Complete (took 1021 ms)
node4 5.586s 2025-09-26 05:44:50.779 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26238448] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=142050, randomLong=-1013979971415881399, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=8150, randomLong=-4969438903849226477, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=1059937, data=35, exception=null] OS Health Check Report - Complete (took 1021 ms)
node2 5.609s 2025-09-26 05:44:50.802 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node2 5.617s 2025-09-26 05:44:50.810 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node4 5.617s 2025-09-26 05:44:50.810 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node2 5.622s 2025-09-26 05:44:50.815 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node4 5.624s 2025-09-26 05:44:50.817 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node4 5.630s 2025-09-26 05:44:50.823 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node2 5.697s 2025-09-26 05:44:50.890 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "I8AGSQ==", "port": 30124 }, { "ipAddressV4": "CoAPwA==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "Ii2xKg==", "port": 30125 }, { "ipAddressV4": "CoAAfw==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "Ii0K/A==", "port": 30126 }, { "ipAddressV4": "CoAPwQ==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "Ikgs/A==", "port": 30127 }, { "ipAddressV4": "CoAAfg==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "Iqoawg==", "port": 30128 }, { "ipAddressV4": "CoAAfQ==", "port": 30128 }] }] }
node4 5.708s 2025-09-26 05:44:50.901 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "I8AGSQ==", "port": 30124 }, { "ipAddressV4": "CoAPwA==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "Ii2xKg==", "port": 30125 }, { "ipAddressV4": "CoAAfw==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "Ii0K/A==", "port": 30126 }, { "ipAddressV4": "CoAPwQ==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "Ikgs/A==", "port": 30127 }, { "ipAddressV4": "CoAAfg==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "Iqoawg==", "port": 30128 }, { "ipAddressV4": "CoAAfQ==", "port": 30128 }] }] }
node2 5.717s 2025-09-26 05:44:50.910 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/2/ConsistencyTestLog.csv
node2 5.718s 2025-09-26 05:44:50.911 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node4 5.729s 2025-09-26 05:44:50.922 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 5.730s 2025-09-26 05:44:50.923 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node2 5.732s 2025-09-26 05:44:50.925 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: c334d0bcd2f13eaa694467d863d0cf77d3b85641c3de68eea65dd52be69e96411bf236bc14b886b83cf2ecb4b1ba89fa (root) ConsistencyTestingToolState / office-trigger-peanut-stem 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate 1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus 2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
node4 5.744s 2025-09-26 05:44:50.937 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: c334d0bcd2f13eaa694467d863d0cf77d3b85641c3de68eea65dd52be69e96411bf236bc14b886b83cf2ecb4b1ba89fa (root) ConsistencyTestingToolState / office-trigger-peanut-stem 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate 1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus 2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
node3 5.800s 2025-09-26 05:44:50.993 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26210282] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=168440, randomLong=-8896451858236211551, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=10320, randomLong=-3105300862458126864, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=1215298, data=35, exception=null] OS Health Check Report - Complete (took 1025 ms)
node3 5.834s 2025-09-26 05:44:51.027 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node3 5.842s 2025-09-26 05:44:51.035 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node3 5.849s 2025-09-26 05:44:51.042 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node4 5.932s 2025-09-26 05:44:51.125 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node2 5.936s 2025-09-26 05:44:51.129 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node3 5.936s 2025-09-26 05:44:51.129 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "I8AGSQ==", "port": 30124 }, { "ipAddressV4": "CoAPwA==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "Ii2xKg==", "port": 30125 }, { "ipAddressV4": "CoAAfw==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "Ii0K/A==", "port": 30126 }, { "ipAddressV4": "CoAPwQ==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "Ikgs/A==", "port": 30127 }, { "ipAddressV4": "CoAAfg==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "Iqoawg==", "port": 30128 }, { "ipAddressV4": "CoAAfQ==", "port": 30128 }] }] }
node4 5.936s 2025-09-26 05:44:51.129 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node2 5.941s 2025-09-26 05:44:51.134 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node4 5.941s 2025-09-26 05:44:51.134 47 INFO STARTUP <<start-node-4>> ConsistencyTestingToolMain: init called in Main for node 4.
node4 5.941s 2025-09-26 05:44:51.134 48 INFO STARTUP <<start-node-4>> SwirldsPlatform: Starting platform 4
node4 5.942s 2025-09-26 05:44:51.135 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node2 5.945s 2025-09-26 05:44:51.138 47 INFO STARTUP <<start-node-2>> ConsistencyTestingToolMain: init called in Main for node 2.
node4 5.945s 2025-09-26 05:44:51.138 50 INFO STARTUP <<start-node-4>> CycleFinder: No cyclical back pressure detected in wiring model.
node2 5.946s 2025-09-26 05:44:51.139 48 INFO STARTUP <<start-node-2>> SwirldsPlatform: Starting platform 2
node4 5.946s 2025-09-26 05:44:51.139 51 INFO STARTUP <<start-node-4>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node2 5.947s 2025-09-26 05:44:51.140 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node4 5.947s 2025-09-26 05:44:51.140 52 INFO STARTUP <<start-node-4>> InputWireChecks: All input wires have been bound.
node4 5.949s 2025-09-26 05:44:51.142 53 WARN STARTUP <<start-node-4>> PcesFileTracker: No preconsensus event files available
node2 5.950s 2025-09-26 05:44:51.143 50 INFO STARTUP <<start-node-2>> CycleFinder: No cyclical back pressure detected in wiring model.
node4 5.950s 2025-09-26 05:44:51.143 54 INFO STARTUP <<start-node-4>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node2 5.951s 2025-09-26 05:44:51.144 51 INFO STARTUP <<start-node-2>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node4 5.951s 2025-09-26 05:44:51.144 55 INFO STARTUP <<start-node-4>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node2 5.952s 2025-09-26 05:44:51.145 52 INFO STARTUP <<start-node-2>> InputWireChecks: All input wires have been bound.
node2 5.953s 2025-09-26 05:44:51.146 53 WARN STARTUP <<start-node-2>> PcesFileTracker: No preconsensus event files available
node2 5.953s 2025-09-26 05:44:51.146 54 INFO STARTUP <<start-node-2>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node4 5.953s 2025-09-26 05:44:51.146 56 INFO PLATFORM_STATUS <platformForkJoinThread-3> DefaultStatusStateMachine: Platform spent 151.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node4 5.953s 2025-09-26 05:44:51.146 57 INFO STARTUP <<app: appMain 4>> ConsistencyTestingToolMain: run called in Main.
node2 5.955s 2025-09-26 05:44:51.148 55 INFO STARTUP <<start-node-2>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node2 5.956s 2025-09-26 05:44:51.149 56 INFO STARTUP <<app: appMain 2>> ConsistencyTestingToolMain: run called in Main.
node2 5.957s 2025-09-26 05:44:51.150 57 INFO PLATFORM_STATUS <platformForkJoinThread-1> DefaultStatusStateMachine: Platform spent 172.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node4 5.958s 2025-09-26 05:44:51.151 58 INFO PLATFORM_STATUS <platformForkJoinThread-3> DefaultStatusStateMachine: Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node3 5.959s 2025-09-26 05:44:51.152 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/3/ConsistencyTestLog.csv
node3 5.960s 2025-09-26 05:44:51.153 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node2 5.961s 2025-09-26 05:44:51.154 58 INFO PLATFORM_STATUS <platformForkJoinThread-1> DefaultStatusStateMachine: Platform spent 3.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node3 5.976s 2025-09-26 05:44:51.169 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: c334d0bcd2f13eaa694467d863d0cf77d3b85641c3de68eea65dd52be69e96411bf236bc14b886b83cf2ecb4b1ba89fa (root) ConsistencyTestingToolState / office-trigger-peanut-stem 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate 1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus 2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
node3 6.189s 2025-09-26 05:44:51.382 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node3 6.195s 2025-09-26 05:44:51.388 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node3 6.201s 2025-09-26 05:44:51.394 47 INFO STARTUP <<start-node-3>> ConsistencyTestingToolMain: init called in Main for node 3.
node3 6.201s 2025-09-26 05:44:51.394 48 INFO STARTUP <<start-node-3>> SwirldsPlatform: Starting platform 3
node3 6.203s 2025-09-26 05:44:51.396 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node3 6.206s 2025-09-26 05:44:51.399 50 INFO STARTUP <<start-node-3>> CycleFinder: No cyclical back pressure detected in wiring model.
node3 6.207s 2025-09-26 05:44:51.400 51 INFO STARTUP <<start-node-3>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node3 6.208s 2025-09-26 05:44:51.401 52 INFO STARTUP <<start-node-3>> InputWireChecks: All input wires have been bound.
node3 6.209s 2025-09-26 05:44:51.402 53 WARN STARTUP <<start-node-3>> PcesFileTracker: No preconsensus event files available
node3 6.210s 2025-09-26 05:44:51.403 54 INFO STARTUP <<start-node-3>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node3 6.211s 2025-09-26 05:44:51.404 55 INFO STARTUP <<start-node-3>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node3 6.212s 2025-09-26 05:44:51.405 56 INFO STARTUP <<app: appMain 3>> ConsistencyTestingToolMain: run called in Main.
node3 6.213s 2025-09-26 05:44:51.406 57 INFO PLATFORM_STATUS <platformForkJoinThread-3> DefaultStatusStateMachine: Platform spent 174.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node3 6.218s 2025-09-26 05:44:51.411 58 INFO PLATFORM_STATUS <platformForkJoinThread-3> DefaultStatusStateMachine: Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node0 6.579s 2025-09-26 05:44:51.772 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26192224] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=193719, randomLong=3213816245903846112, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=10940, randomLong=2385481537721505578, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=1254009, data=35, exception=null] OS Health Check Report - Complete (took 1024 ms)
node1 6.589s 2025-09-26 05:44:51.782 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26209356] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=210800, randomLong=-1407971510707517445, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=9990, randomLong=-4470559091167452621, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=1302130, data=35, exception=null] OS Health Check Report - Complete (took 1024 ms)
node0 6.612s 2025-09-26 05:44:51.805 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node1 6.620s 2025-09-26 05:44:51.813 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node0 6.621s 2025-09-26 05:44:51.814 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node0 6.627s 2025-09-26 05:44:51.820 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node1 6.629s 2025-09-26 05:44:51.822 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node1 6.634s 2025-09-26 05:44:51.827 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node0 6.707s 2025-09-26 05:44:51.900 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "I8AGSQ==", "port": 30124 }, { "ipAddressV4": "CoAPwA==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "Ii2xKg==", "port": 30125 }, { "ipAddressV4": "CoAAfw==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "Ii0K/A==", "port": 30126 }, { "ipAddressV4": "CoAPwQ==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "Ikgs/A==", "port": 30127 }, { "ipAddressV4": "CoAAfg==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "Iqoawg==", "port": 30128 }, { "ipAddressV4": "CoAAfQ==", "port": 30128 }] }] }
node1 6.717s 2025-09-26 05:44:51.910 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "I8AGSQ==", "port": 30124 }, { "ipAddressV4": "CoAPwA==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "Ii2xKg==", "port": 30125 }, { "ipAddressV4": "CoAAfw==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "Ii0K/A==", "port": 30126 }, { "ipAddressV4": "CoAPwQ==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "Ikgs/A==", "port": 30127 }, { "ipAddressV4": "CoAAfg==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "Iqoawg==", "port": 30128 }, { "ipAddressV4": "CoAAfQ==", "port": 30128 }] }] }
node0 6.730s 2025-09-26 05:44:51.923 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/0/ConsistencyTestLog.csv
node0 6.731s 2025-09-26 05:44:51.924 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node1 6.739s 2025-09-26 05:44:51.932 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/1/ConsistencyTestLog.csv
node1 6.740s 2025-09-26 05:44:51.933 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node0 6.748s 2025-09-26 05:44:51.941 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: c334d0bcd2f13eaa694467d863d0cf77d3b85641c3de68eea65dd52be69e96411bf236bc14b886b83cf2ecb4b1ba89fa (root) ConsistencyTestingToolState / office-trigger-peanut-stem 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate 1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus 2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
node1 6.755s 2025-09-26 05:44:51.948 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: c334d0bcd2f13eaa694467d863d0cf77d3b85641c3de68eea65dd52be69e96411bf236bc14b886b83cf2ecb4b1ba89fa (root) ConsistencyTestingToolState / office-trigger-peanut-stem 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate 1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus 2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
node1 6.985s 2025-09-26 05:44:52.178 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node0 6.990s 2025-09-26 05:44:52.183 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node1 6.991s 2025-09-26 05:44:52.184 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node0 6.996s 2025-09-26 05:44:52.189 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node1 6.996s 2025-09-26 05:44:52.189 47 INFO STARTUP <<start-node-1>> ConsistencyTestingToolMain: init called in Main for node 1.
node1 6.997s 2025-09-26 05:44:52.190 48 INFO STARTUP <<start-node-1>> SwirldsPlatform: Starting platform 1
node1 6.998s 2025-09-26 05:44:52.191 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node0 7.001s 2025-09-26 05:44:52.194 47 INFO STARTUP <<start-node-0>> ConsistencyTestingToolMain: init called in Main for node 0.
node0 7.001s 2025-09-26 05:44:52.194 48 INFO STARTUP <<start-node-0>> SwirldsPlatform: Starting platform 0
node1 7.001s 2025-09-26 05:44:52.194 50 INFO STARTUP <<start-node-1>> CycleFinder: No cyclical back pressure detected in wiring model.
node0 7.002s 2025-09-26 05:44:52.195 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node1 7.003s 2025-09-26 05:44:52.196 51 INFO STARTUP <<start-node-1>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node1 7.003s 2025-09-26 05:44:52.196 52 INFO STARTUP <<start-node-1>> InputWireChecks: All input wires have been bound.
node1 7.005s 2025-09-26 05:44:52.198 53 WARN STARTUP <<start-node-1>> PcesFileTracker: No preconsensus event files available
node0 7.006s 2025-09-26 05:44:52.199 50 INFO STARTUP <<start-node-0>> CycleFinder: No cyclical back pressure detected in wiring model.
node1 7.006s 2025-09-26 05:44:52.199 54 INFO STARTUP <<start-node-1>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node0 7.007s 2025-09-26 05:44:52.200 51 INFO STARTUP <<start-node-0>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node0 7.007s 2025-09-26 05:44:52.200 52 INFO STARTUP <<start-node-0>> InputWireChecks: All input wires have been bound.
node1 7.007s 2025-09-26 05:44:52.200 55 INFO STARTUP <<start-node-1>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node0 7.009s 2025-09-26 05:44:52.202 53 WARN STARTUP <<start-node-0>> PcesFileTracker: No preconsensus event files available
node0 7.009s 2025-09-26 05:44:52.202 54 INFO STARTUP <<start-node-0>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node1 7.009s 2025-09-26 05:44:52.202 56 INFO STARTUP <<app: appMain 1>> ConsistencyTestingToolMain: run called in Main.
node1 7.010s 2025-09-26 05:44:52.203 57 INFO PLATFORM_STATUS <platformForkJoinThread-6> DefaultStatusStateMachine: Platform spent 195.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node0 7.011s 2025-09-26 05:44:52.204 55 INFO STARTUP <<start-node-0>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node0 7.012s 2025-09-26 05:44:52.205 56 INFO STARTUP <<app: appMain 0>> ConsistencyTestingToolMain: run called in Main.
node0 7.013s 2025-09-26 05:44:52.206 57 INFO PLATFORM_STATUS <platformForkJoinThread-1> DefaultStatusStateMachine: Platform spent 203.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node1 7.016s 2025-09-26 05:44:52.209 58 INFO PLATFORM_STATUS <platformForkJoinThread-6> DefaultStatusStateMachine: Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node0 7.018s 2025-09-26 05:44:52.211 58 INFO PLATFORM_STATUS <platformForkJoinThread-1> DefaultStatusStateMachine: Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node4 8.955s 2025-09-26 05:44:54.148 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting4.csv' ]
node2 8.956s 2025-09-26 05:44:54.149 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting2.csv' ]
node4 8.957s 2025-09-26 05:44:54.150 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node2 8.958s 2025-09-26 05:44:54.151 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node3 9.213s 2025-09-26 05:44:54.406 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting3.csv' ]
node3 9.215s 2025-09-26 05:44:54.408 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node1 10.014s 2025-09-26 05:44:55.207 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting1.csv' ]
node0 10.018s 2025-09-26 05:44:55.211 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting0.csv' ]
node1 10.018s 2025-09-26 05:44:55.211 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node0 10.021s 2025-09-26 05:44:55.214 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node4 16.047s 2025-09-26 05:45:01.240 61 INFO PLATFORM_STATUS <platformForkJoinThread-1> DefaultStatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node2 16.053s 2025-09-26 05:45:01.246 61 INFO PLATFORM_STATUS <platformForkJoinThread-5> DefaultStatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node3 16.309s 2025-09-26 05:45:01.502 61 INFO PLATFORM_STATUS <platformForkJoinThread-6> DefaultStatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node1 17.104s 2025-09-26 05:45:02.297 61 INFO PLATFORM_STATUS <platformForkJoinThread-3> DefaultStatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node0 17.108s 2025-09-26 05:45:02.301 61 INFO PLATFORM_STATUS <platformForkJoinThread-4> DefaultStatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node2 17.907s 2025-09-26 05:45:03.100 62 INFO STARTUP <<scheduler TransactionHandler>> DefaultTransactionHandler: Ignoring empty consensus round 1
node1 17.948s 2025-09-26 05:45:03.141 62 INFO STARTUP <<scheduler TransactionHandler>> DefaultTransactionHandler: Ignoring empty consensus round 1
node3 17.963s 2025-09-26 05:45:03.156 62 INFO STARTUP <<scheduler TransactionHandler>> DefaultTransactionHandler: Ignoring empty consensus round 1
node0 17.993s 2025-09-26 05:45:03.186 62 INFO STARTUP <<scheduler TransactionHandler>> DefaultTransactionHandler: Ignoring empty consensus round 1
node4 18.032s 2025-09-26 05:45:03.225 62 INFO STARTUP <<scheduler TransactionHandler>> DefaultTransactionHandler: Ignoring empty consensus round 1
node1 18.376s 2025-09-26 05:45:03.569 63 INFO PLATFORM_STATUS <platformForkJoinThread-6> DefaultStatusStateMachine: Platform spent 1.3 s in CHECKING. Now in ACTIVE
node1 18.380s 2025-09-26 05:45:03.573 65 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 2 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node4 18.449s 2025-09-26 05:45:03.642 63 INFO PLATFORM_STATUS <platformForkJoinThread-4> DefaultStatusStateMachine: Platform spent 2.4 s in CHECKING. Now in ACTIVE
node3 18.450s 2025-09-26 05:45:03.643 63 INFO PLATFORM_STATUS <platformForkJoinThread-3> DefaultStatusStateMachine: Platform spent 2.1 s in CHECKING. Now in ACTIVE
node4 18.452s 2025-09-26 05:45:03.645 65 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 2 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node3 18.453s 2025-09-26 05:45:03.646 65 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 2 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node2 18.454s 2025-09-26 05:45:03.647 63 INFO PLATFORM_STATUS <platformForkJoinThread-1> DefaultStatusStateMachine: Platform spent 2.4 s in CHECKING. Now in ACTIVE
node2 18.458s 2025-09-26 05:45:03.651 65 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 2 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node0 18.482s 2025-09-26 05:45:03.675 63 INFO PLATFORM_STATUS <platformForkJoinThread-8> DefaultStatusStateMachine: Platform spent 1.4 s in CHECKING. Now in ACTIVE
node0 18.486s 2025-09-26 05:45:03.679 65 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 2 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node0 18.594s 2025-09-26 05:45:03.787 80 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 2 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/2
node0 18.596s 2025-09-26 05:45:03.789 81 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node3 18.597s 2025-09-26 05:45:03.790 80 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 2 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/2
node3 18.599s 2025-09-26 05:45:03.792 81 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node2 18.666s 2025-09-26 05:45:03.859 80 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 2 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/2
node2 18.668s 2025-09-26 05:45:03.861 81 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node4 18.727s 2025-09-26 05:45:03.920 80 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 2 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/2
node4 18.730s 2025-09-26 05:45:03.923 81 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node1 18.739s 2025-09-26 05:45:03.932 80 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 2 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/2
node1 18.742s 2025-09-26 05:45:03.935 81 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node3 18.846s 2025-09-26 05:45:04.039 111 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node3 18.849s 2025-09-26 05:45:04.042 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 2 Timestamp: 2025-09-26T05:45:02.627078741Z Next consensus number: 12 Legacy running event hash: a03da5314eba7f4c0d0040fba88f469000a3daf6415e6a11f4e7222f52027f2f38f97e0b7c93e0b18c7f6835d18c3b92 Legacy running event mnemonic: wear-fly-struggle-trumpet Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -799502542 Root hash: c8c90fc2d3c38f472911fdc164bbe97fad45c010701e96ffbacb909bb01cf5990c4afd9f1656a99c075a8021fcf33506 (root) ConsistencyTestingToolState / tragic-weapon-marble-blame 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 radar-couch-accident-style 1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus 2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle 3 StringLeaf 8898299034380133366 /3 nephew-deny-real-blanket 4 StringLeaf 1 /4 wreck-whale-old-bottom
node0 18.861s 2025-09-26 05:45:04.054 111 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node0 18.865s 2025-09-26 05:45:04.058 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 2 Timestamp: 2025-09-26T05:45:02.627078741Z Next consensus number: 12 Legacy running event hash: a03da5314eba7f4c0d0040fba88f469000a3daf6415e6a11f4e7222f52027f2f38f97e0b7c93e0b18c7f6835d18c3b92 Legacy running event mnemonic: wear-fly-struggle-trumpet Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -799502542 Root hash: c8c90fc2d3c38f472911fdc164bbe97fad45c010701e96ffbacb909bb01cf5990c4afd9f1656a99c075a8021fcf33506 (root) ConsistencyTestingToolState / tragic-weapon-marble-blame 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 radar-couch-accident-style 1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus 2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle 3 StringLeaf 8898299034380133366 /3 nephew-deny-real-blanket 4 StringLeaf 1 /4 wreck-whale-old-bottom
node3 18.887s 2025-09-26 05:45:04.080 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+45+01.388448799Z_seq0_minr1_maxr501_orgn0.pces
node3 18.888s 2025-09-26 05:45:04.081 116 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+45+01.388448799Z_seq0_minr1_maxr501_orgn0.pces
node3 18.888s 2025-09-26 05:45:04.081 117 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 18.889s 2025-09-26 05:45:04.082 118 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 18.896s 2025-09-26 05:45:04.089 119 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 2 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/2 {"round":2,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/2/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 18.915s 2025-09-26 05:45:04.108 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+45+01.471998334Z_seq0_minr1_maxr501_orgn0.pces
node0 18.916s 2025-09-26 05:45:04.109 116 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+45+01.471998334Z_seq0_minr1_maxr501_orgn0.pces
node0 18.917s 2025-09-26 05:45:04.110 117 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 18.919s 2025-09-26 05:45:04.112 118 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 18.922s 2025-09-26 05:45:04.115 111 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node2 18.926s 2025-09-26 05:45:04.119 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 2 Timestamp: 2025-09-26T05:45:02.627078741Z Next consensus number: 12 Legacy running event hash: a03da5314eba7f4c0d0040fba88f469000a3daf6415e6a11f4e7222f52027f2f38f97e0b7c93e0b18c7f6835d18c3b92 Legacy running event mnemonic: wear-fly-struggle-trumpet Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -799502542 Root hash: c8c90fc2d3c38f472911fdc164bbe97fad45c010701e96ffbacb909bb01cf5990c4afd9f1656a99c075a8021fcf33506 (root) ConsistencyTestingToolState / tragic-weapon-marble-blame 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 radar-couch-accident-style 1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus 2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle 3 StringLeaf 8898299034380133366 /3 nephew-deny-real-blanket 4 StringLeaf 1 /4 wreck-whale-old-bottom
node0 18.928s 2025-09-26 05:45:04.121 119 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 2 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/2 {"round":2,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/2/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 18.963s 2025-09-26 05:45:04.156 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+45+01.285521102Z_seq0_minr1_maxr501_orgn0.pces
node2 18.964s 2025-09-26 05:45:04.157 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+45+01.285521102Z_seq0_minr1_maxr501_orgn0.pces
node2 18.964s 2025-09-26 05:45:04.157 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 18.965s 2025-09-26 05:45:04.158 116 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 18.972s 2025-09-26 05:45:04.165 118 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 2 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/2 {"round":2,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/2/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 18.981s 2025-09-26 05:45:04.174 111 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node4 18.984s 2025-09-26 05:45:04.177 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 2 Timestamp: 2025-09-26T05:45:02.627078741Z Next consensus number: 12 Legacy running event hash: a03da5314eba7f4c0d0040fba88f469000a3daf6415e6a11f4e7222f52027f2f38f97e0b7c93e0b18c7f6835d18c3b92 Legacy running event mnemonic: wear-fly-struggle-trumpet Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -799502542 Root hash: c8c90fc2d3c38f472911fdc164bbe97fad45c010701e96ffbacb909bb01cf5990c4afd9f1656a99c075a8021fcf33506 (root) ConsistencyTestingToolState / tragic-weapon-marble-blame 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 radar-couch-accident-style 1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus 2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle 3 StringLeaf 8898299034380133366 /3 nephew-deny-real-blanket 4 StringLeaf 1 /4 wreck-whale-old-bottom
node1 19.010s 2025-09-26 05:45:04.203 118 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node1 19.013s 2025-09-26 05:45:04.206 119 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 2 Timestamp: 2025-09-26T05:45:02.627078741Z Next consensus number: 12 Legacy running event hash: a03da5314eba7f4c0d0040fba88f469000a3daf6415e6a11f4e7222f52027f2f38f97e0b7c93e0b18c7f6835d18c3b92 Legacy running event mnemonic: wear-fly-struggle-trumpet Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -799502542 Root hash: c8c90fc2d3c38f472911fdc164bbe97fad45c010701e96ffbacb909bb01cf5990c4afd9f1656a99c075a8021fcf33506 (root) ConsistencyTestingToolState / tragic-weapon-marble-blame 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 radar-couch-accident-style 1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus 2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle 3 StringLeaf 8898299034380133366 /3 nephew-deny-real-blanket 4 StringLeaf 1 /4 wreck-whale-old-bottom
node4 19.018s 2025-09-26 05:45:04.211 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T05+45+01.281104503Z_seq0_minr1_maxr501_orgn0.pces
node4 19.019s 2025-09-26 05:45:04.212 116 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T05+45+01.281104503Z_seq0_minr1_maxr501_orgn0.pces
node4 19.019s 2025-09-26 05:45:04.212 117 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 19.020s 2025-09-26 05:45:04.213 118 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 19.026s 2025-09-26 05:45:04.219 119 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 2 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/2 {"round":2,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/2/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 19.050s 2025-09-26 05:45:04.243 120 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+45+01.519856055Z_seq0_minr1_maxr501_orgn0.pces
node1 19.050s 2025-09-26 05:45:04.243 121 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+45+01.519856055Z_seq0_minr1_maxr501_orgn0.pces
node1 19.051s 2025-09-26 05:45:04.244 122 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 19.052s 2025-09-26 05:45:04.245 123 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 19.058s 2025-09-26 05:45:04.251 124 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 2 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/2 {"round":2,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/2/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 1m 16.291s 2025-09-26 05:46:01.484 1402 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 123 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 1m 16.325s 2025-09-26 05:46:01.518 1414 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 123 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 1m 16.394s 2025-09-26 05:46:01.587 1390 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 123 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 1m 16.406s 2025-09-26 05:46:01.599 1392 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 123 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 1m 16.533s 2025-09-26 05:46:01.726 1404 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 123 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 1m 16.545s 2025-09-26 05:46:01.738 1410 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 123 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/123
node1 1m 16.546s 2025-09-26 05:46:01.739 1411 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 123
node2 1m 16.621s 2025-09-26 05:46:01.814 1408 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 123 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/123
node2 1m 16.622s 2025-09-26 05:46:01.815 1409 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 123
node1 1m 16.632s 2025-09-26 05:46:01.825 1456 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 123
node1 1m 16.636s 2025-09-26 05:46:01.829 1457 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 123 Timestamp: 2025-09-26T05:46:00.046751031Z Next consensus number: 4633 Legacy running event hash: b4a10f77706e59084c7ab82252796b7c5c80d345e3f12b9a52b95ecb2f7f9c2fb61131db890e4987f0d2df9480a2741e Legacy running event mnemonic: hip-upon-assist-canal Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 219395129 Root hash: 6e091a641708bf601a05966f171d3f03b90e5b07afecfba8cfd5aaa05496b7cf041cd7a21bd4979281d3a9b2dbdb5c58 (root) ConsistencyTestingToolState / scorpion-erode-three-beef 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 soap-resist-muffin-decade 1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus 2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle 3 StringLeaf 5676784361005942334 /3 toddler-museum-video-yellow 4 StringLeaf 122 /4 use-original-true-leader
node3 1m 16.636s 2025-09-26 05:46:01.829 1420 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 123 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/123
node3 1m 16.637s 2025-09-26 05:46:01.830 1421 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 123
node1 1m 16.646s 2025-09-26 05:46:01.839 1458 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+45+01.519856055Z_seq0_minr1_maxr501_orgn0.pces
node1 1m 16.646s 2025-09-26 05:46:01.839 1459 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 95 File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+45+01.519856055Z_seq0_minr1_maxr501_orgn0.pces
node1 1m 16.647s 2025-09-26 05:46:01.840 1460 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 1m 16.650s 2025-09-26 05:46:01.843 1461 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 1m 16.651s 2025-09-26 05:46:01.844 1462 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 123 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/123 {"round":123,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/123/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 1m 16.692s 2025-09-26 05:46:01.885 1398 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 123 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/123
node0 1m 16.693s 2025-09-26 05:46:01.886 1399 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 123
node2 1m 16.699s 2025-09-26 05:46:01.892 1445 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 123
node2 1m 16.702s 2025-09-26 05:46:01.895 1446 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 123 Timestamp: 2025-09-26T05:46:00.046751031Z Next consensus number: 4633 Legacy running event hash: b4a10f77706e59084c7ab82252796b7c5c80d345e3f12b9a52b95ecb2f7f9c2fb61131db890e4987f0d2df9480a2741e Legacy running event mnemonic: hip-upon-assist-canal Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 219395129 Root hash: 6e091a641708bf601a05966f171d3f03b90e5b07afecfba8cfd5aaa05496b7cf041cd7a21bd4979281d3a9b2dbdb5c58 (root) ConsistencyTestingToolState / scorpion-erode-three-beef 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 soap-resist-muffin-decade 1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus 2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle 3 StringLeaf 5676784361005942334 /3 toddler-museum-video-yellow 4 StringLeaf 122 /4 use-original-true-leader
node2 1m 16.711s 2025-09-26 05:46:01.904 1447 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+45+01.285521102Z_seq0_minr1_maxr501_orgn0.pces
node2 1m 16.711s 2025-09-26 05:46:01.904 1448 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 95 File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+45+01.285521102Z_seq0_minr1_maxr501_orgn0.pces
node2 1m 16.711s 2025-09-26 05:46:01.904 1449 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 1m 16.715s 2025-09-26 05:46:01.908 1450 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 1m 16.715s 2025-09-26 05:46:01.908 1451 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 123 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/123 {"round":123,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/123/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 1m 16.721s 2025-09-26 05:46:01.914 1457 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 123
node3 1m 16.725s 2025-09-26 05:46:01.918 1458 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 123 Timestamp: 2025-09-26T05:46:00.046751031Z Next consensus number: 4633 Legacy running event hash: b4a10f77706e59084c7ab82252796b7c5c80d345e3f12b9a52b95ecb2f7f9c2fb61131db890e4987f0d2df9480a2741e Legacy running event mnemonic: hip-upon-assist-canal Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 219395129 Root hash: 6e091a641708bf601a05966f171d3f03b90e5b07afecfba8cfd5aaa05496b7cf041cd7a21bd4979281d3a9b2dbdb5c58 (root) ConsistencyTestingToolState / scorpion-erode-three-beef 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 soap-resist-muffin-decade 1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus 2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle 3 StringLeaf 5676784361005942334 /3 toddler-museum-video-yellow 4 StringLeaf 122 /4 use-original-true-leader
node4 1m 16.727s 2025-09-26 05:46:01.920 1396 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 123 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/123
node4 1m 16.728s 2025-09-26 05:46:01.921 1397 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 123
node3 1m 16.737s 2025-09-26 05:46:01.930 1459 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+45+01.388448799Z_seq0_minr1_maxr501_orgn0.pces
node3 1m 16.737s 2025-09-26 05:46:01.930 1460 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 95 File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+45+01.388448799Z_seq0_minr1_maxr501_orgn0.pces
node3 1m 16.737s 2025-09-26 05:46:01.930 1461 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 1m 16.741s 2025-09-26 05:46:01.934 1462 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 1m 16.741s 2025-09-26 05:46:01.934 1463 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 123 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/123 {"round":123,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/123/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 1m 16.781s 2025-09-26 05:46:01.974 1440 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 123
node0 1m 16.784s 2025-09-26 05:46:01.977 1441 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 123 Timestamp: 2025-09-26T05:46:00.046751031Z Next consensus number: 4633 Legacy running event hash: b4a10f77706e59084c7ab82252796b7c5c80d345e3f12b9a52b95ecb2f7f9c2fb61131db890e4987f0d2df9480a2741e Legacy running event mnemonic: hip-upon-assist-canal Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 219395129 Root hash: 6e091a641708bf601a05966f171d3f03b90e5b07afecfba8cfd5aaa05496b7cf041cd7a21bd4979281d3a9b2dbdb5c58 (root) ConsistencyTestingToolState / scorpion-erode-three-beef 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 soap-resist-muffin-decade 1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus 2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle 3 StringLeaf 5676784361005942334 /3 toddler-museum-video-yellow 4 StringLeaf 122 /4 use-original-true-leader
node0 1m 16.794s 2025-09-26 05:46:01.987 1442 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+45+01.471998334Z_seq0_minr1_maxr501_orgn0.pces
node0 1m 16.794s 2025-09-26 05:46:01.987 1443 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 95 File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+45+01.471998334Z_seq0_minr1_maxr501_orgn0.pces
node0 1m 16.794s 2025-09-26 05:46:01.987 1444 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 1m 16.798s 2025-09-26 05:46:01.991 1445 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 1m 16.799s 2025-09-26 05:46:01.992 1446 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 123 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/123 {"round":123,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/123/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 1m 16.809s 2025-09-26 05:46:02.002 1442 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 123
node4 1m 16.812s 2025-09-26 05:46:02.005 1443 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 123 Timestamp: 2025-09-26T05:46:00.046751031Z Next consensus number: 4633 Legacy running event hash: b4a10f77706e59084c7ab82252796b7c5c80d345e3f12b9a52b95ecb2f7f9c2fb61131db890e4987f0d2df9480a2741e Legacy running event mnemonic: hip-upon-assist-canal Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 219395129 Root hash: 6e091a641708bf601a05966f171d3f03b90e5b07afecfba8cfd5aaa05496b7cf041cd7a21bd4979281d3a9b2dbdb5c58 (root) ConsistencyTestingToolState / scorpion-erode-three-beef 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 soap-resist-muffin-decade 1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus 2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle 3 StringLeaf 5676784361005942334 /3 toddler-museum-video-yellow 4 StringLeaf 122 /4 use-original-true-leader
node4 1m 16.821s 2025-09-26 05:46:02.014 1444 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T05+45+01.281104503Z_seq0_minr1_maxr501_orgn0.pces
node4 1m 16.821s 2025-09-26 05:46:02.014 1445 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 95
File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T05+45+01.281104503Z_seq0_minr1_maxr501_orgn0.pces
node4 1m 16.821s 2025-09-26 05:46:02.014 1446 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 1m 16.825s 2025-09-26 05:46:02.018 1447 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 1m 16.825s 2025-09-26 05:46:02.018 1448 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 123 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/123 {"round":123,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/123/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 2m 16.075s 2025-09-26 05:47:01.268 2870 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 253 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 2m 16.089s 2025-09-26 05:47:01.282 2866 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 253 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 2m 16.144s 2025-09-26 05:47:01.337 2836 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 253 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 2m 16.167s 2025-09-26 05:47:01.360 2862 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 253 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 2m 16.181s 2025-09-26 05:47:01.374 2854 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 253 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 2m 16.239s 2025-09-26 05:47:01.432 2857 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 253 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/253
node1 2m 16.240s 2025-09-26 05:47:01.433 2858 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 253
node0 2m 16.313s 2025-09-26 05:47:01.506 2865 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 253 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/253
node0 2m 16.314s 2025-09-26 05:47:01.507 2866 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 253
node1 2m 16.338s 2025-09-26 05:47:01.531 2889 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 253
node1 2m 16.340s 2025-09-26 05:47:01.533 2890 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 253
Timestamp: 2025-09-26T05:47:00.195258Z
Next consensus number: 9433
Legacy running event hash: 81491cf030224e5263b28a77197e169c3a0ca4c0f7ae774bc6fce16be774c04d797d554a1fc7e6df2b04e9fd2918cb9d
Legacy running event mnemonic: street-own-affair-ski
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 891438579
Root hash: 4c8cf2a34c1bda916df07fc73a0b6804488e7231da24635d5c288242b3322be3b3d4fc664b25a4bfa952912e2b6dc736
(root) ConsistencyTestingToolState / muscle-produce-lock-pupil
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 add-first-general-olive
    1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus
    2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
    3 StringLeaf -5183239733235124613 /3 victory-laptop-latin-soon
    4 StringLeaf 252 /4 small-defy-decrease-youth
node1 2m 16.349s 2025-09-26 05:47:01.542 2891 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+45+01.519856055Z_seq0_minr1_maxr501_orgn0.pces
node1 2m 16.350s 2025-09-26 05:47:01.543 2892 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 226
File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+45+01.519856055Z_seq0_minr1_maxr501_orgn0.pces
node1 2m 16.350s 2025-09-26 05:47:01.543 2893 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 2m 16.357s 2025-09-26 05:47:01.550 2894 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 2m 16.358s 2025-09-26 05:47:01.551 2895 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 253 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/253 {"round":253,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/253/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 2m 16.383s 2025-09-26 05:47:01.576 2869 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 253 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/253
node2 2m 16.384s 2025-09-26 05:47:01.577 2870 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 253
node4 2m 16.407s 2025-09-26 05:47:01.600 2849 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 253 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/253
node4 2m 16.408s 2025-09-26 05:47:01.601 2850 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 253
node0 2m 16.415s 2025-09-26 05:47:01.608 2901 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 253
node0 2m 16.418s 2025-09-26 05:47:01.611 2902 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 253
Timestamp: 2025-09-26T05:47:00.195258Z
Next consensus number: 9433
Legacy running event hash: 81491cf030224e5263b28a77197e169c3a0ca4c0f7ae774bc6fce16be774c04d797d554a1fc7e6df2b04e9fd2918cb9d
Legacy running event mnemonic: street-own-affair-ski
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 891438579
Root hash: 4c8cf2a34c1bda916df07fc73a0b6804488e7231da24635d5c288242b3322be3b3d4fc664b25a4bfa952912e2b6dc736
(root) ConsistencyTestingToolState / muscle-produce-lock-pupil
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 add-first-general-olive
    1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus
    2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
    3 StringLeaf -5183239733235124613 /3 victory-laptop-latin-soon
    4 StringLeaf 252 /4 small-defy-decrease-youth
node0 2m 16.426s 2025-09-26 05:47:01.619 2903 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+45+01.471998334Z_seq0_minr1_maxr501_orgn0.pces
node0 2m 16.427s 2025-09-26 05:47:01.620 2904 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 226
File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+45+01.471998334Z_seq0_minr1_maxr501_orgn0.pces
node0 2m 16.427s 2025-09-26 05:47:01.620 2905 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 2m 16.434s 2025-09-26 05:47:01.627 2906 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 2m 16.434s 2025-09-26 05:47:01.627 2907 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 253 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/253 {"round":253,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/253/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 2m 16.444s 2025-09-26 05:47:01.637 2873 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 253 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/253
node3 2m 16.444s 2025-09-26 05:47:01.637 2874 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 253
node2 2m 16.470s 2025-09-26 05:47:01.663 2901 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 253
node2 2m 16.472s 2025-09-26 05:47:01.665 2902 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 253
Timestamp: 2025-09-26T05:47:00.195258Z
Next consensus number: 9433
Legacy running event hash: 81491cf030224e5263b28a77197e169c3a0ca4c0f7ae774bc6fce16be774c04d797d554a1fc7e6df2b04e9fd2918cb9d
Legacy running event mnemonic: street-own-affair-ski
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 891438579
Root hash: 4c8cf2a34c1bda916df07fc73a0b6804488e7231da24635d5c288242b3322be3b3d4fc664b25a4bfa952912e2b6dc736
(root) ConsistencyTestingToolState / muscle-produce-lock-pupil
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 add-first-general-olive
    1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus
    2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
    3 StringLeaf -5183239733235124613 /3 victory-laptop-latin-soon
    4 StringLeaf 252 /4 small-defy-decrease-youth
node2 2m 16.481s 2025-09-26 05:47:01.674 2903 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+45+01.285521102Z_seq0_minr1_maxr501_orgn0.pces
node2 2m 16.481s 2025-09-26 05:47:01.674 2904 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 226
File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+45+01.285521102Z_seq0_minr1_maxr501_orgn0.pces
node2 2m 16.481s 2025-09-26 05:47:01.674 2905 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 2m 16.488s 2025-09-26 05:47:01.681 2906 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 2m 16.488s 2025-09-26 05:47:01.681 2907 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 253 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/253 {"round":253,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/253/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 2m 16.505s 2025-09-26 05:47:01.698 2881 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 253
node4 2m 16.507s 2025-09-26 05:47:01.700 2882 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 253
Timestamp: 2025-09-26T05:47:00.195258Z
Next consensus number: 9433
Legacy running event hash: 81491cf030224e5263b28a77197e169c3a0ca4c0f7ae774bc6fce16be774c04d797d554a1fc7e6df2b04e9fd2918cb9d
Legacy running event mnemonic: street-own-affair-ski
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 891438579
Root hash: 4c8cf2a34c1bda916df07fc73a0b6804488e7231da24635d5c288242b3322be3b3d4fc664b25a4bfa952912e2b6dc736
(root) ConsistencyTestingToolState / muscle-produce-lock-pupil
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 add-first-general-olive
    1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus
    2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
    3 StringLeaf -5183239733235124613 /3 victory-laptop-latin-soon
    4 StringLeaf 252 /4 small-defy-decrease-youth
node4 2m 16.518s 2025-09-26 05:47:01.711 2883 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T05+45+01.281104503Z_seq0_minr1_maxr501_orgn0.pces
node4 2m 16.518s 2025-09-26 05:47:01.711 2884 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 226
File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T05+45+01.281104503Z_seq0_minr1_maxr501_orgn0.pces
node4 2m 16.518s 2025-09-26 05:47:01.711 2885 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 2m 16.525s 2025-09-26 05:47:01.718 2886 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 2m 16.526s 2025-09-26 05:47:01.719 2887 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 253 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/253 {"round":253,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/253/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 2m 16.534s 2025-09-26 05:47:01.727 2905 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 253
node3 2m 16.536s 2025-09-26 05:47:01.729 2906 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 253
Timestamp: 2025-09-26T05:47:00.195258Z
Next consensus number: 9433
Legacy running event hash: 81491cf030224e5263b28a77197e169c3a0ca4c0f7ae774bc6fce16be774c04d797d554a1fc7e6df2b04e9fd2918cb9d
Legacy running event mnemonic: street-own-affair-ski
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 891438579
Root hash: 4c8cf2a34c1bda916df07fc73a0b6804488e7231da24635d5c288242b3322be3b3d4fc664b25a4bfa952912e2b6dc736
(root) ConsistencyTestingToolState / muscle-produce-lock-pupil
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 add-first-general-olive
    1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus
    2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
    3 StringLeaf -5183239733235124613 /3 victory-laptop-latin-soon
    4 StringLeaf 252 /4 small-defy-decrease-youth
node3 2m 16.543s 2025-09-26 05:47:01.736 2907 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+45+01.388448799Z_seq0_minr1_maxr501_orgn0.pces
node3 2m 16.544s 2025-09-26 05:47:01.737 2908 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 226
File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+45+01.388448799Z_seq0_minr1_maxr501_orgn0.pces
node3 2m 16.544s 2025-09-26 05:47:01.737 2909 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 2m 16.550s 2025-09-26 05:47:01.743 2910 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 2m 16.551s 2025-09-26 05:47:01.744 2911 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 253 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/253 {"round":253,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/253/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 3m 15.684s 2025-09-26 05:48:00.877 4339 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 388 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 3m 15.710s 2025-09-26 05:48:00.903 4359 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 388 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 3m 15.764s 2025-09-26 05:48:00.957 4359 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 388 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 3m 15.831s 2025-09-26 05:48:01.024 4389 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 388 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 3m 15.899s 2025-09-26 05:48:01.092 4392 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 388 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/388
node2 3m 15.899s 2025-09-26 05:48:01.092 4393 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 388
node0 3m 15.900s 2025-09-26 05:48:01.093 4352 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 388 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/388
node0 3m 15.901s 2025-09-26 05:48:01.094 4353 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 388
node3 3m 15.969s 2025-09-26 05:48:01.162 4362 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 388 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/388
node3 3m 15.970s 2025-09-26 05:48:01.163 4363 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 388
node2 3m 15.982s 2025-09-26 05:48:01.175 4428 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 388
node2 3m 15.983s 2025-09-26 05:48:01.176 4429 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 388
Timestamp: 2025-09-26T05:48:00.085346Z
Next consensus number: 14138
Legacy running event hash: 1e0616a3a4615f54f9784260e3e639d27dfaebc8769f78df097735c24c1752087db23dd76e7ce51df9e743b3c9fc6c61
Legacy running event mnemonic: school-stamp-initial-tunnel
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1100198655
Root hash: 64d5d527ce5f9e1a6daee3754f65e5d051bd0acd3c130dd0126c103e379719f58b0d62d25181c1ea7b7305264a737e66
(root) ConsistencyTestingToolState / one-forest-crouch-devote
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 mother-cheap-advice-explain
    1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus
    2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
    3 StringLeaf 4750617620723104032 /3 useful-harsh-increase-earn
    4 StringLeaf 387 /4 margin-grace-hidden-wine
node2 3m 15.991s 2025-09-26 05:48:01.184 4430 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+45+01.285521102Z_seq0_minr1_maxr501_orgn0.pces
node0 3m 15.992s 2025-09-26 05:48:01.185 4384 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 388
node2 3m 15.992s 2025-09-26 05:48:01.185 4431 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 360
File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+45+01.285521102Z_seq0_minr1_maxr501_orgn0.pces
node2 3m 15.992s 2025-09-26 05:48:01.185 4432 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 3m 15.994s 2025-09-26 05:48:01.187 4385 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 388
Timestamp: 2025-09-26T05:48:00.085346Z
Next consensus number: 14138
Legacy running event hash: 1e0616a3a4615f54f9784260e3e639d27dfaebc8769f78df097735c24c1752087db23dd76e7ce51df9e743b3c9fc6c61
Legacy running event mnemonic: school-stamp-initial-tunnel
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1100198655
Root hash: 64d5d527ce5f9e1a6daee3754f65e5d051bd0acd3c130dd0126c103e379719f58b0d62d25181c1ea7b7305264a737e66
(root) ConsistencyTestingToolState / one-forest-crouch-devote
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 mother-cheap-advice-explain
    1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus
    2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
    3 StringLeaf 4750617620723104032 /3 useful-harsh-increase-earn
    4 StringLeaf 387 /4 margin-grace-hidden-wine
node2 3m 16.002s 2025-09-26 05:48:01.195 4433 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 3m 16.002s 2025-09-26 05:48:01.195 4434 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 388 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/388 {"round":388,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/388/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 3m 16.004s 2025-09-26 05:48:01.197 4386 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+45+01.471998334Z_seq0_minr1_maxr501_orgn0.pces
node0 3m 16.004s 2025-09-26 05:48:01.197 4387 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 360
File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+45+01.471998334Z_seq0_minr1_maxr501_orgn0.pces
node0 3m 16.004s 2025-09-26 05:48:01.197 4388 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 3m 16.015s 2025-09-26 05:48:01.208 4397 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 3m 16.015s 2025-09-26 05:48:01.208 4398 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 388 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/388 {"round":388,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/388/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 3m 16.057s 2025-09-26 05:48:01.250 4398 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 388
node3 3m 16.059s 2025-09-26 05:48:01.252 4399 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 388
Timestamp: 2025-09-26T05:48:00.085346Z
Next consensus number: 14138
Legacy running event hash: 1e0616a3a4615f54f9784260e3e639d27dfaebc8769f78df097735c24c1752087db23dd76e7ce51df9e743b3c9fc6c61
Legacy running event mnemonic: school-stamp-initial-tunnel
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1100198655
Root hash: 64d5d527ce5f9e1a6daee3754f65e5d051bd0acd3c130dd0126c103e379719f58b0d62d25181c1ea7b7305264a737e66
(root) ConsistencyTestingToolState / one-forest-crouch-devote
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 mother-cheap-advice-explain
    1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus
    2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
    3 StringLeaf 4750617620723104032 /3 useful-harsh-increase-earn
    4 StringLeaf 387 /4 margin-grace-hidden-wine
node3 3m 16.066s 2025-09-26 05:48:01.259 4400 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+45+01.388448799Z_seq0_minr1_maxr501_orgn0.pces
node3 3m 16.066s 2025-09-26 05:48:01.259 4401 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 360
File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+45+01.388448799Z_seq0_minr1_maxr501_orgn0.pces
node3 3m 16.066s 2025-09-26 05:48:01.259 4402 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 3m 16.068s 2025-09-26 05:48:01.261 4372 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 388 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/388
node1 3m 16.070s 2025-09-26 05:48:01.263 4373 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 388
node3 3m 16.076s 2025-09-26 05:48:01.269 4403 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 3m 16.077s 2025-09-26 05:48:01.270 4404 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 388 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/388 {"round":388,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/388/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 3m 16.163s 2025-09-26 05:48:01.356 4412 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 388
node1 3m 16.165s 2025-09-26 05:48:01.358 4413 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 388
Timestamp: 2025-09-26T05:48:00.085346Z
Next consensus number: 14138
Legacy running event hash: 1e0616a3a4615f54f9784260e3e639d27dfaebc8769f78df097735c24c1752087db23dd76e7ce51df9e743b3c9fc6c61
Legacy running event mnemonic: school-stamp-initial-tunnel
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1100198655
Root hash: 64d5d527ce5f9e1a6daee3754f65e5d051bd0acd3c130dd0126c103e379719f58b0d62d25181c1ea7b7305264a737e66
(root) ConsistencyTestingToolState / one-forest-crouch-devote
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 mother-cheap-advice-explain
    1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus
    2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
    3 StringLeaf 4750617620723104032 /3 useful-harsh-increase-earn
    4 StringLeaf 387 /4 margin-grace-hidden-wine
node1 3m 16.174s 2025-09-26 05:48:01.367 4414 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+45+01.519856055Z_seq0_minr1_maxr501_orgn0.pces
node1 3m 16.174s 2025-09-26 05:48:01.367 4415 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 360
File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+45+01.519856055Z_seq0_minr1_maxr501_orgn0.pces
node1 3m 16.174s 2025-09-26 05:48:01.367 4416 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 3m 16.185s 2025-09-26 05:48:01.378 4417 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 3m 16.186s 2025-09-26 05:48:01.379 4418 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 388 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/388 {"round":388,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/388/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 4m 15.980s 2025-09-26 05:49:01.173 6104 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 527 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 4m 15.984s 2025-09-26 05:49:01.177 5918 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 527 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 4m 16.024s 2025-09-26 05:49:01.217 5936 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 527 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 4m 16.124s 2025-09-26 05:49:01.317 5954 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 527 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 4m 16.230s 2025-09-26 05:49:01.423 5921 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 527 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/527
node0 4m 16.230s 2025-09-26 05:49:01.423 5922 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 527
node2 4m 16.279s 2025-09-26 05:49:01.472 5957 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 527 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/527
node2 4m 16.279s 2025-09-26 05:49:01.472 5958 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 527
node3 4m 16.300s 2025-09-26 05:49:01.493 5939 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 527 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/527
node3 4m 16.301s 2025-09-26 05:49:01.494 5940 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 527
node0 4m 16.323s 2025-09-26 05:49:01.516 5953 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 527
node0 4m 16.325s 2025-09-26 05:49:01.518 5954 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 527
Timestamp: 2025-09-26T05:49:00.281551Z
Next consensus number: 17473
Legacy running event hash: a79e0c7483bf0979d3f3f5dd2764fefcbce2b9f58f76c97e0dce57cf78018f28fb6b14c70fc05a6cc549e20c8f0f2fa9
Legacy running event mnemonic: arrange-model-market-connect
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1970050804
Root hash: fdf203eb7c291ba8a967a60db97cbc0c39cf3a48ff72eab747d97d45ae83e6c89799d826b74a43a65f059468ccccaa23
(root) ConsistencyTestingToolState / dove-latin-lunch-insect
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 dignity-juice-ship-orbit
    1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus
    2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
    3 StringLeaf -3437894034850241049 /3 cabbage-pig-fence-wood
    4 StringLeaf 526 /4 plastic-rookie-simple-corn
node1 4m 16.332s 2025-09-26 05:49:01.525 6117 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 527 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/527
node1 4m 16.332s 2025-09-26 05:49:01.525 6118 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 527
node0 4m 16.334s 2025-09-26 05:49:01.527 5955 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+45+01.471998334Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+48+49.995300241Z_seq1_minr474_maxr5474_orgn0.pces
node0 4m 16.334s 2025-09-26 05:49:01.527 5956 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus event files meeting specified criteria to copy.
Lower bound: 500
First file to copy: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+45+01.471998334Z_seq0_minr1_maxr501_orgn0.pces
Last file to copy: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+48+49.995300241Z_seq1_minr474_maxr5474_orgn0.pces
node0 4m 16.334s 2025-09-26 05:49:01.527 5957 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 2 preconsensus event file(s)
node0 4m 16.347s 2025-09-26 05:49:01.540 5958 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 2 preconsensus event file(s)
node0 4m 16.347s 2025-09-26 05:49:01.540 5959 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 527 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/527 {"round":527,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/527/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 4m 16.370s 2025-09-26 05:49:01.563 5993 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 527
node2 4m 16.372s 2025-09-26 05:49:01.565 5994 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 527
Timestamp: 2025-09-26T05:49:00.281551Z
Next consensus number: 17473
Legacy running event hash: a79e0c7483bf0979d3f3f5dd2764fefcbce2b9f58f76c97e0dce57cf78018f28fb6b14c70fc05a6cc549e20c8f0f2fa9
Legacy running event mnemonic: arrange-model-market-connect
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1970050804
Root hash: fdf203eb7c291ba8a967a60db97cbc0c39cf3a48ff72eab747d97d45ae83e6c89799d826b74a43a65f059468ccccaa23
(root) ConsistencyTestingToolState / dove-latin-lunch-insect
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 dignity-juice-ship-orbit
    1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus
    2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
    3 StringLeaf -3437894034850241049 /3 cabbage-pig-fence-wood
    4 StringLeaf 526 /4 plastic-rookie-simple-corn
node2 4m 16.379s 2025-09-26 05:49:01.572 5995 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+45+01.285521102Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+48+50.059759146Z_seq1_minr474_maxr5474_orgn0.pces
node2 4m 16.379s 2025-09-26 05:49:01.572 5996 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus event files meeting specified criteria to copy.
Lower bound: 500
First file to copy: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+45+01.285521102Z_seq0_minr1_maxr501_orgn0.pces
Last file to copy: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+48+50.059759146Z_seq1_minr474_maxr5474_orgn0.pces
node2 4m 16.379s 2025-09-26 05:49:01.572 5997 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 2 preconsensus event file(s)
node3 4m 16.386s 2025-09-26 05:49:01.579 5971 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 527
node3 4m 16.388s 2025-09-26 05:49:01.581 5972 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 527
Timestamp: 2025-09-26T05:49:00.281551Z
Next consensus number: 17473
Legacy running event hash: a79e0c7483bf0979d3f3f5dd2764fefcbce2b9f58f76c97e0dce57cf78018f28fb6b14c70fc05a6cc549e20c8f0f2fa9
Legacy running event mnemonic: arrange-model-market-connect
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1970050804
Root hash: fdf203eb7c291ba8a967a60db97cbc0c39cf3a48ff72eab747d97d45ae83e6c89799d826b74a43a65f059468ccccaa23
(root) ConsistencyTestingToolState / dove-latin-lunch-insect
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 dignity-juice-ship-orbit
    1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus
    2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
    3 StringLeaf -3437894034850241049 /3 cabbage-pig-fence-wood
    4 StringLeaf 526 /4 plastic-rookie-simple-corn
node2 4m 16.391s 2025-09-26 05:49:01.584 5998 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 2 preconsensus event file(s)
node2 4m 16.392s 2025-09-26 05:49:01.585 5999 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 527 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/527 {"round":527,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/527/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 4m 16.395s 2025-09-26 05:49:01.588 5973 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+48+50.021850797Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+45+01.388448799Z_seq0_minr1_maxr501_orgn0.pces
node3 4m 16.395s 2025-09-26 05:49:01.588 5974 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus event files meeting specified criteria to copy.
Lower bound: 500
First file to copy: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+45+01.388448799Z_seq0_minr1_maxr501_orgn0.pces
Last file to copy: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+48+50.021850797Z_seq1_minr474_maxr5474_orgn0.pces
node3 4m 16.396s 2025-09-26 05:49:01.589 5975 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 2 preconsensus event file(s)
node3 4m 16.408s 2025-09-26 05:49:01.601 5976 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 2 preconsensus event file(s)
node3 4m 16.408s 2025-09-26 05:49:01.601 5985 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 527 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/527 {"round":527,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/527/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 4m 16.434s 2025-09-26 05:49:01.627 6161 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 527
node1 4m 16.436s 2025-09-26 05:49:01.629 6162 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 527
Timestamp: 2025-09-26T05:49:00.281551Z
Next consensus number: 17473
Legacy running event hash: a79e0c7483bf0979d3f3f5dd2764fefcbce2b9f58f76c97e0dce57cf78018f28fb6b14c70fc05a6cc549e20c8f0f2fa9
Legacy running event mnemonic: arrange-model-market-connect
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1970050804
Root hash: fdf203eb7c291ba8a967a60db97cbc0c39cf3a48ff72eab747d97d45ae83e6c89799d826b74a43a65f059468ccccaa23
(root) ConsistencyTestingToolState / dove-latin-lunch-insect
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 dignity-juice-ship-orbit
    1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus
    2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
    3 StringLeaf -3437894034850241049 /3 cabbage-pig-fence-wood
    4 StringLeaf 526 /4 plastic-rookie-simple-corn
node1 4m 16.445s 2025-09-26 05:49:01.638 6163 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+45+01.519856055Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+48+49.940654922Z_seq1_minr474_maxr5474_orgn0.pces
node1 4m 16.446s 2025-09-26 05:49:01.639 6164 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus event files meeting specified criteria to copy.
Lower bound: 500
First file to copy: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+45+01.519856055Z_seq0_minr1_maxr501_orgn0.pces
Last file to copy: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+48+49.940654922Z_seq1_minr474_maxr5474_orgn0.pces
node1 4m 16.446s 2025-09-26 05:49:01.639 6165 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 2 preconsensus event file(s)
node1 4m 16.458s 2025-09-26 05:49:01.651 6166 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 2 preconsensus event file(s)
node1 4m 16.459s 2025-09-26 05:49:01.652 6167 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 527 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/527 {"round":527,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/527/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 5m 15.900s 2025-09-26 05:50:01.093 7842 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 665 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 5m 15.974s 2025-09-26 05:50:01.167 7486 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 665 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 5m 15.988s 2025-09-26 05:50:01.181 7500 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 665 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 5m 15.989s 2025-09-26 05:50:01.182 7508 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 665 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 5m 16.179s 2025-09-26 05:50:01.372 7503 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 665 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/665
node3 5m 16.180s 2025-09-26 05:50:01.373 7504 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 665
node0 5m 16.182s 2025-09-26 05:50:01.375 7489 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 665 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/665
node0 5m 16.182s 2025-09-26 05:50:01.375 7490 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 665
node2 5m 16.205s 2025-09-26 05:50:01.398 7511 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 665 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/665
node2 5m 16.206s 2025-09-26 05:50:01.399 7512 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 665
node3 5m 16.266s 2025-09-26 05:50:01.459 7539 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 665
node1 5m 16.267s 2025-09-26 05:50:01.460 7855 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 665 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/665
node1 5m 16.268s 2025-09-26 05:50:01.461 7856 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 665
node3 5m 16.268s 2025-09-26 05:50:01.461 7540 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 665
Timestamp: 2025-09-26T05:50:00.221147Z
Next consensus number: 20765
Legacy running event hash: 127b7871dc58c4ca6e42a7db139a664bd0da305b59fd04930ecdc2f5fcffd58dce46976eb9d2df5bcde3af2c22d258b0
Legacy running event mnemonic: exhaust-athlete-insect-grocery
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -669701841
Root hash: b376afb6e6f2fd81315e7d6db12cc159763aa5ddf1d7bc15a33bb53c48eb554e399e6237c62d5c7c638f539a5a823407
(root) ConsistencyTestingToolState / mule-announce-canyon-exit
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 ripple-dwarf-walnut-tool
    1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus
    2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
    3 StringLeaf -2928572788321669980 /3 hurdle-genre-donate-device
    4 StringLeaf 664 /4 basket-mistake-wild-raw
node0 5m 16.271s 2025-09-26 05:50:01.464 7525 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 665
node0 5m 16.273s 2025-09-26 05:50:01.466 7526 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 665
Timestamp: 2025-09-26T05:50:00.221147Z
Next consensus number: 20765
Legacy running event hash: 127b7871dc58c4ca6e42a7db139a664bd0da305b59fd04930ecdc2f5fcffd58dce46976eb9d2df5bcde3af2c22d258b0
Legacy running event mnemonic: exhaust-athlete-insect-grocery
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -669701841
Root hash: b376afb6e6f2fd81315e7d6db12cc159763aa5ddf1d7bc15a33bb53c48eb554e399e6237c62d5c7c638f539a5a823407
(root) ConsistencyTestingToolState / mule-announce-canyon-exit
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 ripple-dwarf-walnut-tool
    1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus
    2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
    3 StringLeaf -2928572788321669980 /3 hurdle-genre-donate-device
    4 StringLeaf 664 /4 basket-mistake-wild-raw
node3 5m 16.276s 2025-09-26 05:50:01.469 7541 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+48+50.021850797Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+45+01.388448799Z_seq0_minr1_maxr501_orgn0.pces
node3 5m 16.276s 2025-09-26 05:50:01.469 7542 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 638
File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+48+50.021850797Z_seq1_minr474_maxr5474_orgn0.pces
node3 5m 16.276s 2025-09-26 05:50:01.469 7543 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 5m 16.279s 2025-09-26 05:50:01.472 7544 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 5m 16.279s 2025-09-26 05:50:01.472 7545 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 665 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/665 {"round":665,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/665/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 5m 16.281s 2025-09-26 05:50:01.474 7527 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+45+01.471998334Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+48+49.995300241Z_seq1_minr474_maxr5474_orgn0.pces
node0 5m 16.281s 2025-09-26 05:50:01.474 7528 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 638 File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+48+49.995300241Z_seq1_minr474_maxr5474_orgn0.pces
node0 5m 16.281s 2025-09-26 05:50:01.474 7529 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 5m 16.281s 2025-09-26 05:50:01.474 7546 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/2
node0 5m 16.284s 2025-09-26 05:50:01.477 7530 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 5m 16.285s 2025-09-26 05:50:01.478 7531 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 665 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/665 {"round":665,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/665/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 5m 16.286s 2025-09-26 05:50:01.479 7532 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/2
node2 5m 16.293s 2025-09-26 05:50:01.486 7543 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 665
node2 5m 16.295s 2025-09-26 05:50:01.488 7544 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 665
Timestamp: 2025-09-26T05:50:00.221147Z
Next consensus number: 20765
Legacy running event hash: 127b7871dc58c4ca6e42a7db139a664bd0da305b59fd04930ecdc2f5fcffd58dce46976eb9d2df5bcde3af2c22d258b0
Legacy running event mnemonic: exhaust-athlete-insect-grocery
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -669701841
Root hash: b376afb6e6f2fd81315e7d6db12cc159763aa5ddf1d7bc15a33bb53c48eb554e399e6237c62d5c7c638f539a5a823407
(root) ConsistencyTestingToolState / mule-announce-canyon-exit
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 ripple-dwarf-walnut-tool
  1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus
  2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
  3 StringLeaf -2928572788321669980 /3 hurdle-genre-donate-device
  4 StringLeaf 664 /4 basket-mistake-wild-raw
node2 5m 16.302s 2025-09-26 05:50:01.495 7545 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+45+01.285521102Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+48+50.059759146Z_seq1_minr474_maxr5474_orgn0.pces
node2 5m 16.302s 2025-09-26 05:50:01.495 7546 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 638 File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+48+50.059759146Z_seq1_minr474_maxr5474_orgn0.pces
node2 5m 16.302s 2025-09-26 05:50:01.495 7547 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 5m 16.305s 2025-09-26 05:50:01.498 7548 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 5m 16.305s 2025-09-26 05:50:01.498 7549 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 665 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/665 {"round":665,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/665/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 5m 16.307s 2025-09-26 05:50:01.500 7550 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/2
node1 5m 16.360s 2025-09-26 05:50:01.553 7895 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 665
node1 5m 16.362s 2025-09-26 05:50:01.555 7896 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 665
Timestamp: 2025-09-26T05:50:00.221147Z
Next consensus number: 20765
Legacy running event hash: 127b7871dc58c4ca6e42a7db139a664bd0da305b59fd04930ecdc2f5fcffd58dce46976eb9d2df5bcde3af2c22d258b0
Legacy running event mnemonic: exhaust-athlete-insect-grocery
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -669701841
Root hash: b376afb6e6f2fd81315e7d6db12cc159763aa5ddf1d7bc15a33bb53c48eb554e399e6237c62d5c7c638f539a5a823407
(root) ConsistencyTestingToolState / mule-announce-canyon-exit
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 ripple-dwarf-walnut-tool
  1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus
  2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
  3 StringLeaf -2928572788321669980 /3 hurdle-genre-donate-device
  4 StringLeaf 664 /4 basket-mistake-wild-raw
node1 5m 16.370s 2025-09-26 05:50:01.563 7897 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+45+01.519856055Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+48+49.940654922Z_seq1_minr474_maxr5474_orgn0.pces
node1 5m 16.370s 2025-09-26 05:50:01.563 7898 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 638 File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+48+49.940654922Z_seq1_minr474_maxr5474_orgn0.pces
node1 5m 16.371s 2025-09-26 05:50:01.564 7899 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 5m 16.374s 2025-09-26 05:50:01.567 7900 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 5m 16.374s 2025-09-26 05:50:01.567 7901 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 665 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/665 {"round":665,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/665/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 5m 16.376s 2025-09-26 05:50:01.569 7902 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/2
node4 5m 50.502s 2025-09-26 05:50:35.695 1 INFO STARTUP <main> StaticPlatformBuilder:
////////////////////// // Node is Starting // //////////////////////
node4 5m 50.588s 2025-09-26 05:50:35.781 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node4 5m 50.603s 2025-09-26 05:50:35.796 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 5m 50.713s 2025-09-26 05:50:35.906 4 INFO STARTUP <main> Browser: The following nodes [4] are set to run locally
node4 5m 50.719s 2025-09-26 05:50:35.912 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node4 5m 50.731s 2025-09-26 05:50:35.924 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 5m 51.137s 2025-09-26 05:50:36.330 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node4 5m 51.138s 2025-09-26 05:50:36.331 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node4 5m 51.924s 2025-09-26 05:50:37.117 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 785ms
node4 5m 51.932s 2025-09-26 05:50:37.125 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node4 5m 51.934s 2025-09-26 05:50:37.127 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 5m 51.967s 2025-09-26 05:50:37.160 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node4 5m 52.035s 2025-09-26 05:50:37.228 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node4 5m 52.037s 2025-09-26 05:50:37.230 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node4 5m 54.077s 2025-09-26 05:50:39.270 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node4 5m 54.157s 2025-09-26 05:50:39.350 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 54.163s 2025-09-26 05:50:39.356 21 INFO STARTUP <main> StartupStateUtils: The following saved states were found on disk:
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/253/SignedState.swh
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/123/SignedState.swh
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/2/SignedState.swh
node4 5m 54.164s 2025-09-26 05:50:39.357 22 INFO STARTUP <main> StartupStateUtils: Loading latest state from disk.
node4 5m 54.164s 2025-09-26 05:50:39.357 23 INFO STARTUP <main> StartupStateUtils: Loading signed state from disk: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/253/SignedState.swh
node4 5m 54.168s 2025-09-26 05:50:39.361 24 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 5m 54.173s 2025-09-26 05:50:39.366 25 INFO STATE_TO_DISK <main> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp
node4 5m 54.299s 2025-09-26 05:50:39.492 36 INFO STARTUP <main> StartupStateUtils: Loaded state's hash is the same as when it was saved.
node4 5m 54.303s 2025-09-26 05:50:39.496 37 INFO STARTUP <main> StartupStateUtils: Platform has loaded a saved state {"round":253,"consensusTimestamp":"2025-09-26T05:47:00.195258Z"} [com.swirlds.logging.legacy.payload.SavedStateLoadedPayload]
node4 5m 54.305s 2025-09-26 05:50:39.498 40 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 54.307s 2025-09-26 05:50:39.500 43 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node4 5m 54.309s 2025-09-26 05:50:39.502 44 INFO STARTUP <main> AddressBookInitializer: Using the loaded state's address book and weight values.
node4 5m 54.316s 2025-09-26 05:50:39.509 45 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 54.317s 2025-09-26 05:50:39.510 46 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 55.346s 2025-09-26 05:50:40.539 47 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26233594]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=155770, randomLong=-1240562994359043426, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=8960, randomLong=-5788909897221253391, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=893160, data=35, exception=null]
OS Health Check Report - Complete (took 1016 ms)
node4 5m 55.371s 2025-09-26 05:50:40.564 48 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node4 5m 55.496s 2025-09-26 05:50:40.689 49 INFO STARTUP <main> PcesUtilities: Span compaction completed for data/saved/preconsensus-events/4/2025/09/26/2025-09-26T05+45+01.281104503Z_seq0_minr1_maxr501_orgn0.pces, new upper bound is 376
node4 5m 55.498s 2025-09-26 05:50:40.691 50 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node4 5m 55.503s 2025-09-26 05:50:40.696 51 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node4 5m 55.570s 2025-09-26 05:50:40.763 52 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "I8AGSQ==", "port": 30124 }, { "ipAddressV4": "CoAPwA==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "Ii2xKg==", "port": 30125 }, { "ipAddressV4": "CoAAfw==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "Ii0K/A==", "port": 30126 }, { "ipAddressV4": "CoAPwQ==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "Ikgs/A==", "port": 30127 }, { "ipAddressV4": "CoAAfg==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "Iqoawg==", "port": 30128 }, { "ipAddressV4": "CoAAfQ==", "port": 30128 }] }] }
node4 5m 55.588s 2025-09-26 05:50:40.781 53 INFO STARTUP <main> ConsistencyTestingToolState: State initialized with state long -5183239733235124613.
node4 5m 55.588s 2025-09-26 05:50:40.781 54 INFO STARTUP <main> ConsistencyTestingToolState: State initialized with 252 rounds handled.
node4 5m 55.589s 2025-09-26 05:50:40.782 55 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 5m 55.589s 2025-09-26 05:50:40.782 56 INFO STARTUP <main> TransactionHandlingHistory: Log file found. Parsing previous history
node4 5m 56.318s 2025-09-26 05:50:41.511 57 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 253
Timestamp: 2025-09-26T05:47:00.195258Z
Next consensus number: 9433
Legacy running event hash: 81491cf030224e5263b28a77197e169c3a0ca4c0f7ae774bc6fce16be774c04d797d554a1fc7e6df2b04e9fd2918cb9d
Legacy running event mnemonic: street-own-affair-ski
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 891438579
Root hash: 4c8cf2a34c1bda916df07fc73a0b6804488e7231da24635d5c288242b3322be3b3d4fc664b25a4bfa952912e2b6dc736
(root) ConsistencyTestingToolState / muscle-produce-lock-pupil
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 add-first-general-olive
  1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus
  2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
  3 StringLeaf -5183239733235124613 /3 victory-laptop-latin-soon
  4 StringLeaf 252 /4 small-defy-decrease-youth
node4 5m 56.552s 2025-09-26 05:50:41.745 59 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 81491cf030224e5263b28a77197e169c3a0ca4c0f7ae774bc6fce16be774c04d797d554a1fc7e6df2b04e9fd2918cb9d
node4 5m 56.563s 2025-09-26 05:50:41.756 60 INFO STARTUP <platformForkJoinThread-4> Shadowgraph: Shadowgraph starting from expiration threshold 226
node4 5m 56.570s 2025-09-26 05:50:41.763 62 INFO STARTUP <<start-node-4>> ConsistencyTestingToolMain: init called in Main for node 4.
node4 5m 56.571s 2025-09-26 05:50:41.764 63 INFO STARTUP <<start-node-4>> SwirldsPlatform: Starting platform 4
node4 5m 56.572s 2025-09-26 05:50:41.765 64 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node4 5m 56.575s 2025-09-26 05:50:41.768 65 INFO STARTUP <<start-node-4>> CycleFinder: No cyclical back pressure detected in wiring model.
node4 5m 56.576s 2025-09-26 05:50:41.769 66 INFO STARTUP <<start-node-4>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node4 5m 56.577s 2025-09-26 05:50:41.770 67 INFO STARTUP <<start-node-4>> InputWireChecks: All input wires have been bound.
node4 5m 56.579s 2025-09-26 05:50:41.772 68 INFO STARTUP <<start-node-4>> SwirldsPlatform: replaying preconsensus event stream starting at 226
node4 5m 56.582s 2025-09-26 05:50:41.775 69 INFO PLATFORM_STATUS <platformForkJoinThread-4> DefaultStatusStateMachine: Platform spent 179.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node4 5m 56.868s 2025-09-26 05:50:42.061 70 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:4 H:f839a43420c4 BR:251), num remaining: 4
node4 5m 56.871s 2025-09-26 05:50:42.064 71 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:2 H:367da2d5a4a7 BR:251), num remaining: 3
node4 5m 56.872s 2025-09-26 05:50:42.065 72 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:3 H:de41dd04daed BR:251), num remaining: 2
node4 5m 56.873s 2025-09-26 05:50:42.066 73 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:1 H:0593c51f365a BR:252), num remaining: 1
node4 5m 56.874s 2025-09-26 05:50:42.067 74 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:0 H:db0c04e246c5 BR:251), num remaining: 0
node4 5m 57.760s 2025-09-26 05:50:42.953 1029 INFO STARTUP <<start-node-4>> PcesReplayer: Replayed 5,424 preconsensus events with max birth round 376. These events contained 7,497 transactions. 122 rounds reached consensus spanning 54.3 seconds of consensus time. The latest round to reach consensus is round 375. Replay took 1.2 seconds.
node4 5m 57.763s 2025-09-26 05:50:42.956 1031 INFO PLATFORM_STATUS <platformForkJoinThread-5> DefaultStatusStateMachine: Platform spent 1.2 s in REPLAYING_EVENTS. Now in OBSERVING
node4 5m 57.763s 2025-09-26 05:50:42.956 1030 INFO STARTUP <<app: appMain 4>> ConsistencyTestingToolMain: run called in Main.
node4 5m 58.690s 2025-09-26 05:50:43.883 1198 INFO RECONNECT <<platform-core: SyncProtocolWith1 4 to 1>> RpcPeerHandler: SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=375,ancientThreshold=347,expiredThreshold=274] remote ev=EventWindow[latestConsensusRound=763,ancientThreshold=736,expiredThreshold=662]
node0 5m 58.760s 2025-09-26 05:50:43.953 8603 INFO RECONNECT <<platform-core: SyncProtocolWith4 0 to 4>> RpcPeerHandler: OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=763,ancientThreshold=736,expiredThreshold=662] remote ev=EventWindow[latestConsensusRound=375,ancientThreshold=347,expiredThreshold=274]
node2 5m 58.760s 2025-09-26 05:50:43.953 8637 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> RpcPeerHandler: OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=763,ancientThreshold=736,expiredThreshold=662] remote ev=EventWindow[latestConsensusRound=375,ancientThreshold=347,expiredThreshold=274]
node1 5m 58.761s 2025-09-26 05:50:43.954 9157 INFO RECONNECT <<platform-core: SyncProtocolWith4 1 to 4>> RpcPeerHandler: OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=763,ancientThreshold=736,expiredThreshold=662] remote ev=EventWindow[latestConsensusRound=375,ancientThreshold=347,expiredThreshold=274]
node3 5m 58.761s 2025-09-26 05:50:43.954 8617 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> RpcPeerHandler: OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=763,ancientThreshold=736,expiredThreshold=662] remote ev=EventWindow[latestConsensusRound=375,ancientThreshold=347,expiredThreshold=274]
node4 5m 58.829s 2025-09-26 05:50:44.022 1199 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> RpcPeerHandler: SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=375,ancientThreshold=347,expiredThreshold=274] remote ev=EventWindow[latestConsensusRound=763,ancientThreshold=736,expiredThreshold=662]
node4 5m 58.830s 2025-09-26 05:50:44.023 1200 INFO RECONNECT <<platform-core: SyncProtocolWith3 4 to 3>> RpcPeerHandler: SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=375,ancientThreshold=347,expiredThreshold=274] remote ev=EventWindow[latestConsensusRound=763,ancientThreshold=736,expiredThreshold=662]
node4 5m 58.830s 2025-09-26 05:50:44.023 1201 INFO RECONNECT <<platform-core: SyncProtocolWith0 4 to 0>> RpcPeerHandler: SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=375,ancientThreshold=347,expiredThreshold=274] remote ev=EventWindow[latestConsensusRound=763,ancientThreshold=736,expiredThreshold=662]
node4 5m 58.830s 2025-09-26 05:50:44.023 1202 INFO PLATFORM_STATUS <platformForkJoinThread-2> DefaultStatusStateMachine: Platform spent 1.1 s in OBSERVING. Now in BEHIND
node4 5m 58.831s 2025-09-26 05:50:44.024 1203 INFO RECONNECT <platformForkJoinThread-5> ReconnectController: Starting ReconnectController
node4 5m 58.832s 2025-09-26 05:50:44.025 1204 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectPlatformHelperImpl: Preparing for reconnect, stopping gossip
node4 5m 58.983s 2025-09-26 05:50:44.176 1205 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectPlatformHelperImpl: Preparing for reconnect, start clearing queues
node4 5m 58.985s 2025-09-26 05:50:44.178 1206 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectPlatformHelperImpl: Queues have been cleared
node4 5m 58.987s 2025-09-26 05:50:44.180 1207 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectController: waiting for reconnect connection
node4 5m 58.987s 2025-09-26 05:50:44.180 1208 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectController: acquired reconnect connection
node2 5m 59.079s 2025-09-26 05:50:44.272 8641 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> ReconnectTeacher: Starting reconnect in the role of the sender {"receiving":false,"nodeId":2,"otherNodeId":4,"round":763} [com.swirlds.logging.legacy.payload.ReconnectStartPayload]
node2 5m 59.080s 2025-09-26 05:50:44.273 8642 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> ReconnectTeacher: The following state will be sent to the learner:
Round: 763
Timestamp: 2025-09-26T05:50:42.745577033Z
Next consensus number: 23121
Legacy running event hash: 8aa1a1d25be598f24b8e9c7c46a7e2ad273b27d9f0eecee31ffa79dfad1ac5317f4e8c1c251fc9f9f6b3d924ed8977bd
Legacy running event mnemonic: program-shift-lake-tide
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 2096286556
Root hash: a2cb486d471eacb9633dd96480bbd81b3d9f15003b474dd617c541fa1daa33e6dada7745a5c851f4ecbb237b54be36ed
(root) ConsistencyTestingToolState / pluck-asset-toilet-happy
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 parade-valid-item-minimum
  1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus
  2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
  3 StringLeaf -2613296028232452112 /3 index-call-believe-kind
  4 StringLeaf 762 /4 virtual-monkey-weird-extend
node2 5m 59.080s 2025-09-26 05:50:44.273 8643 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> ReconnectTeacher: Sending signatures from nodes 1, 2, 3 (signing weight = 37500000000/50000000000) for state hash a2cb486d471eacb9633dd96480bbd81b3d9f15003b474dd617c541fa1daa33e6dada7745a5c851f4ecbb237b54be36ed
node2 5m 59.081s 2025-09-26 05:50:44.274 8644 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> ReconnectTeacher: Starting synchronization in the role of the sender.
node2 5m 59.085s 2025-09-26 05:50:44.278 8645 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> TeachingSynchronizer: sending tree rooted at com.swirlds.demo.consistency.ConsistencyTestingToolState with route []
node2 5m 59.093s 2025-09-26 05:50:44.286 8646 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@7b260bee start run()
node4 5m 59.147s 2025-09-26 05:50:44.340 1209 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectSyncHelper: Starting reconnect in role of the receiver. {"receiving":true,"nodeId":4,"otherNodeId":2,"round":375} [com.swirlds.logging.legacy.payload.ReconnectStartPayload]
node4 5m 59.148s 2025-09-26 05:50:44.341 1210 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectLearner: Receiving signed state signatures
node4 5m 59.151s 2025-09-26 05:50:44.344 1211 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectLearner: Received signatures from nodes 1, 2, 3
node4 5m 59.154s 2025-09-26 05:50:44.347 1212 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls receiveTree()
node4 5m 59.154s 2025-09-26 05:50:44.347 1213 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: synchronizing tree
node4 5m 59.154s 2025-09-26 05:50:44.347 1214 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: receiving tree rooted at com.swirlds.demo.consistency.ConsistencyTestingToolState with route []
node4 5m 59.160s 2025-09-26 05:50:44.353 1215 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@402efc51 start run()
node4 5m 59.165s 2025-09-26 05:50:44.358 1216 INFO STARTUP <<work group learning-synchronizer: async-input-stream #0>> ConsistencyTestingToolState: New State Constructed.
node2 5m 59.246s 2025-09-26 05:50:44.439 8665 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@7b260bee finish run()
node2 5m 59.246s 2025-09-26 05:50:44.439 8666 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> TeachingSynchronizer: finished sending tree
node2 5m 59.247s 2025-09-26 05:50:44.440 8667 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> TeachingSynchronizer: sending tree rooted at com.swirlds.virtualmap.VirtualMap with route [2]
node2 5m 59.248s 2025-09-26 05:50:44.441 8668 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@10967f8e start run()
node4 5m 59.364s 2025-09-26 05:50:44.557 1238 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread finished the learning loop for the current subtree
node4 5m 59.364s 2025-09-26 05:50:44.557 1239 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread closed input, output, and view for the current subtree
node4 5m 59.365s 2025-09-26 05:50:44.558 1240 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@402efc51 finish run()
node4 5m 59.365s 2025-09-26 05:50:44.558 1241 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: received tree rooted at com.swirlds.demo.consistency.ConsistencyTestingToolState with route []
node4 5m 59.365s 2025-09-26 05:50:44.558 1242 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: receiving tree rooted at com.swirlds.virtualmap.VirtualMap with route [2]
node4 5m 59.368s 2025-09-26 05:50:44.561 1243 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@1fa9961 start run()
node4 5m 59.425s 2025-09-26 05:50:44.618 1244 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> ReconnectNodeRemover: setPathInformation(): firstLeafPath: 1 -> 1, lastLeafPath: 1 -> 1
node4 5m 59.426s 2025-09-26 05:50:44.619 1245 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> ReconnectNodeRemover: setPathInformation(): done
node4 5m 59.428s 2025-09-26 05:50:44.621 1246 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread finished the learning loop for the current subtree
node4 5m 59.428s 2025-09-26 05:50:44.621 1247 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: call nodeRemover.allNodesReceived()
node4 5m 59.428s 2025-09-26 05:50:44.621 1248 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> ReconnectNodeRemover: allNodesReceived()
node4 5m 59.429s 2025-09-26 05:50:44.622 1249 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> ReconnectNodeRemover: allNodesReceived(): done
node4 5m 59.429s 2025-09-26 05:50:44.622 1250 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: call root.endLearnerReconnect()
node4 5m 59.429s 2025-09-26 05:50:44.622 1251 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call reconnectIterator.close()
node4 5m 59.429s 2025-09-26 05:50:44.622 1252 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call setHashPrivate()
node2 5m 59.498s 2025-09-26 05:50:44.691 8672 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@10967f8e finish run()
node2 5m 59.498s 2025-09-26 05:50:44.691 8673 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> TeachingSynchronizer: finished sending tree
node2 5m 59.501s 2025-09-26 05:50:44.694 8676 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> ReconnectTeacher: Finished synchronization in the role of the sender.
node4 5m 59.583s 2025-09-26 05:50:44.776 1260 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting4.csv' ]
node4 5m 59.585s 2025-09-26 05:50:44.778 1263 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call postInit()
node4 5m 59.585s 2025-09-26 05:50:44.778 1265 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: endLearnerReconnect() complete
node4 5m 59.586s 2025-09-26 05:50:44.779 1266 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: close() complete
node4 5m 59.586s 2025-09-26 05:50:44.779 1267 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread closed input, output, and view for the current subtree
node4 5m 59.586s 2025-09-26 05:50:44.779 1268 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@1fa9961 finish run()
node4 5m 59.586s 2025-09-26 05:50:44.779 1269 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node4 5m 59.587s 2025-09-26 05:50:44.780 1270 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: received tree rooted at com.swirlds.virtualmap.VirtualMap with route [2]
node4 5m 59.587s 2025-09-26 05:50:44.780 1271 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: synchronization complete
node4 5m 59.588s 2025-09-26 05:50:44.781 1272 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls initialize()
node4 5m 59.588s 2025-09-26 05:50:44.781 1273 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: initializing tree
node4 5m 59.588s 2025-09-26 05:50:44.781 1274 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: initialization complete
node4 5m 59.588s 2025-09-26 05:50:44.781 1275 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls hash()
node4 5m 59.589s 2025-09-26 05:50:44.782 1276 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: hashing tree
node4 5m 59.589s 2025-09-26 05:50:44.782 1277 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: hashing complete
node4 5m 59.590s 2025-09-26 05:50:44.783 1278 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls logStatistics()
node4 5m 59.593s 2025-09-26 05:50:44.786 1279 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: Finished synchronization {"timeInSeconds":0.433,"hashTimeInSeconds":0.0,"initializationTimeInSeconds":0.0,"totalNodes":12,"leafNodes":7,"redundantLeafNodes":4,"internalNodes":5,"redundantInternalNodes":2} [com.swirlds.logging.legacy.payload.SynchronizationCompletePayload]
node4 5m 59.593s 2025-09-26 05:50:44.786 1280 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: ReconnectMapMetrics: transfersFromTeacher=12; transfersFromLearner=10; internalHashes=0; internalCleanHashes=0; internalData=0; internalCleanData=0; leafHashes=3; leafCleanHashes=3; leafData=7; leafCleanData=4
node4 5m 59.593s 2025-09-26 05:50:44.786 1281 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner is done synchronizing
node4 5m 59.597s 2025-09-26 05:50:44.790 1282 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectLearner: Reconnect data usage report {"dataMegabytes":0.006056785583496094} [com.swirlds.logging.legacy.payload.ReconnectDataUsagePayload]
node4 5m 59.600s 2025-09-26 05:50:44.793 1283 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectSyncHelper: Finished reconnect in the role of the receiver. {"receiving":true,"nodeId":4,"otherNodeId":2,"round":763,"success":false} [com.swirlds.logging.legacy.payload.ReconnectFinishPayload]
node4 5m 59.601s 2025-09-26 05:50:44.794 1284 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectSyncHelper: Information for state received during reconnect:
Round: 763
Timestamp: 2025-09-26T05:50:42.745577033Z
Next consensus number: 23121
Legacy running event hash: 8aa1a1d25be598f24b8e9c7c46a7e2ad273b27d9f0eecee31ffa79dfad1ac5317f4e8c1c251fc9f9f6b3d924ed8977bd
Legacy running event mnemonic: program-shift-lake-tide
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 2096286556
Root hash: a2cb486d471eacb9633dd96480bbd81b3d9f15003b474dd617c541fa1daa33e6dada7745a5c851f4ecbb237b54be36ed
(root) ConsistencyTestingToolState / pluck-asset-toilet-happy
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 parade-valid-item-minimum
  1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus
  2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
  3 StringLeaf -2613296028232452112 /3 index-call-believe-kind
  4 StringLeaf 762 /4 virtual-monkey-weird-extend
node4 5m 59.601s 2025-09-26 05:50:44.794 1286 DEBUG RECONNECT <<reconnect: reconnect-controller>> ReconnectStateLoader: `loadReconnectState` : reloading state
node4 5m 59.602s 2025-09-26 05:50:44.795 1287 INFO STARTUP <<reconnect: reconnect-controller>> ConsistencyTestingToolState: State initialized with state long -2613296028232452112.
node4 5m 59.602s 2025-09-26 05:50:44.795 1288 INFO STARTUP <<reconnect: reconnect-controller>> ConsistencyTestingToolState: State initialized with 762 rounds handled.
node4 5m 59.602s 2025-09-26 05:50:44.795 1289 INFO STARTUP <<reconnect: reconnect-controller>> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 5m 59.602s 2025-09-26 05:50:44.795 1290 INFO STARTUP <<reconnect: reconnect-controller>> TransactionHandlingHistory: Log file found. Parsing previous history
node4 5m 59.620s 2025-09-26 05:50:44.813 1295 INFO STATE_TO_DISK <<reconnect: reconnect-controller>> DefaultSavedStateController: Signed state from round 763 created, will eventually be written to disk, for reason: RECONNECT
node4 5m 59.621s 2025-09-26 05:50:44.814 1296 INFO PLATFORM_STATUS <platformForkJoinThread-2> DefaultStatusStateMachine: Platform spent 789.0 ms in BEHIND. Now in RECONNECT_COMPLETE
node4 5m 59.621s 2025-09-26 05:50:44.814 1298 INFO STARTUP <platformForkJoinThread-8> Shadowgraph: Shadowgraph starting from expiration threshold 736
node4 5m 59.624s 2025-09-26 05:50:44.817 1300 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 763 state to disk. Reason: RECONNECT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/763
node4 5m 59.625s 2025-09-26 05:50:44.818 1301 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/2 for round 763
node4 5m 59.634s 2025-09-26 05:50:44.827 1304 INFO EVENT_STREAM <<reconnect: reconnect-controller>> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 8aa1a1d25be598f24b8e9c7c46a7e2ad273b27d9f0eecee31ffa79dfad1ac5317f4e8c1c251fc9f9f6b3d924ed8977bd
node4 5m 59.636s 2025-09-26 05:50:44.829 1308 INFO STARTUP <platformForkJoinThread-4> PcesFileManager: Due to recent operations on this node, the local preconsensus event stream will have a discontinuity. The last file with the old origin round is 2025-09-26T05+45+01.281104503Z_seq0_minr1_maxr376_orgn0.pces. All future files will have an origin round of 763.
node2 5m 59.670s 2025-09-26 05:50:44.863 8677 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> ReconnectTeacher: Finished reconnect in the role of the sender. {"receiving":false,"nodeId":2,"otherNodeId":4,"round":763,"success":false} [com.swirlds.logging.legacy.payload.ReconnectFinishPayload]
node4 5m 59.764s 2025-09-26 05:50:44.957 1335 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/2 for round 763
node4 5m 59.767s 2025-09-26 05:50:44.960 1336 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 763
Timestamp: 2025-09-26T05:50:42.745577033Z
Next consensus number: 23121
Legacy running event hash: 8aa1a1d25be598f24b8e9c7c46a7e2ad273b27d9f0eecee31ffa79dfad1ac5317f4e8c1c251fc9f9f6b3d924ed8977bd
Legacy running event mnemonic: program-shift-lake-tide
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 2096286556
Root hash: a2cb486d471eacb9633dd96480bbd81b3d9f15003b474dd617c541fa1daa33e6dada7745a5c851f4ecbb237b54be36ed
(root) ConsistencyTestingToolState / pluck-asset-toilet-happy
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 parade-valid-item-minimum
  1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus
  2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
  3 StringLeaf -2613296028232452112 /3 index-call-believe-kind
  4 StringLeaf 762 /4 virtual-monkey-weird-extend
node4 5m 59.807s 2025-09-26 05:50:45.000 1340 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T05+45+01.281104503Z_seq0_minr1_maxr376_orgn0.pces
node4 5m 59.809s 2025-09-26 05:50:45.002 1341 WARN STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: No preconsensus event files meeting specified criteria found to copy. Lower bound: 736
node4 5m 59.815s 2025-09-26 05:50:45.008 1342 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 763 to disk. Reason: RECONNECT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/763 {"round":763,"freezeState":false,"reason":"RECONNECT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/763/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 5m 59.818s 2025-09-26 05:50:45.011 1343 INFO PLATFORM_STATUS <platformForkJoinThread-5> DefaultStatusStateMachine: Platform spent 196.0 ms in RECONNECT_COMPLETE. Now in CHECKING
node4 6.011m 2025-09-26 05:50:45.832 1344 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:1 H:3123c3097271 BR:761), num remaining: 3
node4 6.011m 2025-09-26 05:50:45.832 1345 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:0 H:c84a68a3f474 BR:761), num remaining: 2
node4 6.011m 2025-09-26 05:50:45.833 1346 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:3 H:39a68670c729 BR:761), num remaining: 1
node4 6.011m 2025-09-26 05:50:45.833 1347 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:2 H:0d61879d3a68 BR:761), num remaining: 0
node1 6m 3.742s 2025-09-26 05:50:48.935 9284 INFO RECONNECT <<platform-core: SyncProtocolWith4 1 to 4>> RpcPeerHandler: OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=775,ancientThreshold=748,expiredThreshold=674] remote ev=EventWindow[latestConsensusRound=375,ancientThreshold=347,expiredThreshold=274]
node4 6m 3.812s 2025-09-26 05:50:49.005 1462 INFO RECONNECT <<platform-core: SyncProtocolWith1 4 to 1>> RpcPeerHandler: SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=375,ancientThreshold=347,expiredThreshold=274] remote ev=EventWindow[latestConsensusRound=775,ancientThreshold=748,expiredThreshold=674]
node4 6m 3.813s 2025-09-26 05:50:49.006 1463 INFO RECONNECT <<platform-core: SyncProtocolWith1 4 to 1>> RpcPeerHandler: Latest event window is not really falling behind, will retry sync local ev=EventWindow[latestConsensusRound=774,ancientThreshold=747,expiredThreshold=736] remote ev=EventWindow[latestConsensusRound=775,ancientThreshold=748,expiredThreshold=674]
node4 6m 4.961s 2025-09-26 05:50:50.154 1485 INFO PLATFORM_STATUS <platformForkJoinThread-2> DefaultStatusStateMachine: Platform spent 5.1 s in CHECKING. Now in ACTIVE
node3 6m 15.624s 2025-09-26 05:51:00.817 9049 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 802 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 6m 15.691s 2025-09-26 05:51:00.884 9578 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 802 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 6m 15.695s 2025-09-26 05:51:00.888 9027 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 802 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 6m 15.705s 2025-09-26 05:51:00.898 1755 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 802 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 6m 15.720s 2025-09-26 05:51:00.913 9098 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 802 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 6m 15.847s 2025-09-26 05:51:01.040 9101 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 802 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/802
node2 6m 15.851s 2025-09-26 05:51:01.044 9102 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/38 for round 802
node1 6m 15.865s 2025-09-26 05:51:01.058 9581 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 802 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/802
node1 6m 15.866s 2025-09-26 05:51:01.059 9582 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 802
node0 6m 15.936s 2025-09-26 05:51:01.129 9030 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 802 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/802
node0 6m 15.937s 2025-09-26 05:51:01.130 9031 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 802
node2 6m 15.953s 2025-09-26 05:51:01.146 9141 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/38 for round 802
node2 6m 15.955s 2025-09-26 05:51:01.148 9142 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 802
Timestamp: 2025-09-26T05:51:00.005543Z
Next consensus number: 24370
Legacy running event hash: 05bb4776c97800a795074c760e909f8bc0477778b7c8d8689dbbac56e2fdfc36964e25d32994b6460244948ed4124ac5
Legacy running event mnemonic: resist-uphold-speed-belt
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1851216590
Root hash: 29fe73e01bf5bb022a86428b1596218a8c45465fb3d150919a7e7d1c9cf3554a2633299814fc26422cdd561875cbd4ac
(root) ConsistencyTestingToolState / divert-lab-leaf-panther
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 often-woman-over-amused
  1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus
  2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
  3 StringLeaf 8941973283600778650 /3 custom-empower-apart-damage
  4 StringLeaf 801 /4 release-kingdom-various-pelican
node1 6m 15.961s 2025-09-26 05:51:01.154 9613 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 802
node1 6m 15.963s 2025-09-26 05:51:01.156 9614 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 802
Timestamp: 2025-09-26T05:51:00.005543Z
Next consensus number: 24370
Legacy running event hash: 05bb4776c97800a795074c760e909f8bc0477778b7c8d8689dbbac56e2fdfc36964e25d32994b6460244948ed4124ac5
Legacy running event mnemonic: resist-uphold-speed-belt
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1851216590
Root hash: 29fe73e01bf5bb022a86428b1596218a8c45465fb3d150919a7e7d1c9cf3554a2633299814fc26422cdd561875cbd4ac
(root) ConsistencyTestingToolState / divert-lab-leaf-panther
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 often-woman-over-amused
  1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus
  2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
  3 StringLeaf 8941973283600778650 /3 custom-empower-apart-damage
  4 StringLeaf 801 /4 release-kingdom-various-pelican
node2 6m 15.965s 2025-09-26 05:51:01.158 9143 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+45+01.285521102Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+48+50.059759146Z_seq1_minr474_maxr5474_orgn0.pces
node2 6m 15.966s 2025-09-26 05:51:01.159 9144 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 775 File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+48+50.059759146Z_seq1_minr474_maxr5474_orgn0.pces
node2 6m 15.966s 2025-09-26 05:51:01.159 9145 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 6m 15.970s 2025-09-26 05:51:01.163 9615 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+45+01.519856055Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+48+49.940654922Z_seq1_minr474_maxr5474_orgn0.pces
node1 6m 15.970s 2025-09-26 05:51:01.163 9616 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 775
File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+48+49.940654922Z_seq1_minr474_maxr5474_orgn0.pces
node1 6m 15.970s 2025-09-26 05:51:01.163 9617 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 6m 15.973s 2025-09-26 05:51:01.166 9146 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 6m 15.973s 2025-09-26 05:51:01.166 9147 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 802 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/802 {"round":802,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/802/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 6m 15.975s 2025-09-26 05:51:01.168 9148 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/123
node3 6m 15.975s 2025-09-26 05:51:01.168 9052 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 802 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/802
node1 6m 15.976s 2025-09-26 05:51:01.169 9618 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 6m 15.976s 2025-09-26 05:51:01.169 9619 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 802 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/802 {"round":802,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/802/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 6m 15.976s 2025-09-26 05:51:01.169 9053 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 802
node4 6m 15.976s 2025-09-26 05:51:01.169 1758 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 802 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/802
node4 6m 15.977s 2025-09-26 05:51:01.170 1767 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 802
node1 6m 15.978s 2025-09-26 05:51:01.171 9620 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/123
node0 6m 16.027s 2025-09-26 05:51:01.220 9066 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 802
node0 6m 16.029s 2025-09-26 05:51:01.222 9067 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 802
Timestamp: 2025-09-26T05:51:00.005543Z
Next consensus number: 24370
Legacy running event hash: 05bb4776c97800a795074c760e909f8bc0477778b7c8d8689dbbac56e2fdfc36964e25d32994b6460244948ed4124ac5
Legacy running event mnemonic: resist-uphold-speed-belt
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1851216590
Root hash: 29fe73e01bf5bb022a86428b1596218a8c45465fb3d150919a7e7d1c9cf3554a2633299814fc26422cdd561875cbd4ac
(root) ConsistencyTestingToolState / divert-lab-leaf-panther
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 often-woman-over-amused
  1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus
  2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
  3 StringLeaf 8941973283600778650 /3 custom-empower-apart-damage
  4 StringLeaf 801 /4 release-kingdom-various-pelican
node0 6m 16.034s 2025-09-26 05:51:01.227 9068 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+45+01.471998334Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+48+49.995300241Z_seq1_minr474_maxr5474_orgn0.pces
node0 6m 16.034s 2025-09-26 05:51:01.227 9069 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 775
File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+48+49.995300241Z_seq1_minr474_maxr5474_orgn0.pces
node0 6m 16.035s 2025-09-26 05:51:01.228 9070 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 6m 16.040s 2025-09-26 05:51:01.233 9071 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 6m 16.041s 2025-09-26 05:51:01.234 9072 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 802 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/802 {"round":802,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/802/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 6m 16.042s 2025-09-26 05:51:01.235 9073 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/123
node3 6m 16.058s 2025-09-26 05:51:01.251 9088 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 802
node3 6m 16.060s 2025-09-26 05:51:01.253 9089 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 802
Timestamp: 2025-09-26T05:51:00.005543Z
Next consensus number: 24370
Legacy running event hash: 05bb4776c97800a795074c760e909f8bc0477778b7c8d8689dbbac56e2fdfc36964e25d32994b6460244948ed4124ac5
Legacy running event mnemonic: resist-uphold-speed-belt
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1851216590
Root hash: 29fe73e01bf5bb022a86428b1596218a8c45465fb3d150919a7e7d1c9cf3554a2633299814fc26422cdd561875cbd4ac
(root) ConsistencyTestingToolState / divert-lab-leaf-panther
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 often-woman-over-amused
  1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus
  2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
  3 StringLeaf 8941973283600778650 /3 custom-empower-apart-damage
  4 StringLeaf 801 /4 release-kingdom-various-pelican
node3 6m 16.067s 2025-09-26 05:51:01.260 9090 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+48+50.021850797Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+45+01.388448799Z_seq0_minr1_maxr501_orgn0.pces
node3 6m 16.067s 2025-09-26 05:51:01.260 9091 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 775
File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+48+50.021850797Z_seq1_minr474_maxr5474_orgn0.pces
node3 6m 16.068s 2025-09-26 05:51:01.261 9092 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 6m 16.073s 2025-09-26 05:51:01.266 9093 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 6m 16.073s 2025-09-26 05:51:01.266 9094 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 802 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/802 {"round":802,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/802/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 6m 16.075s 2025-09-26 05:51:01.268 9095 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/123
node4 6m 16.133s 2025-09-26 05:51:01.326 1799 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 802
node4 6m 16.136s 2025-09-26 05:51:01.329 1800 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 802
Timestamp: 2025-09-26T05:51:00.005543Z
Next consensus number: 24370
Legacy running event hash: 05bb4776c97800a795074c760e909f8bc0477778b7c8d8689dbbac56e2fdfc36964e25d32994b6460244948ed4124ac5
Legacy running event mnemonic: resist-uphold-speed-belt
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1851216590
Root hash: 29fe73e01bf5bb022a86428b1596218a8c45465fb3d150919a7e7d1c9cf3554a2633299814fc26422cdd561875cbd4ac
(root) ConsistencyTestingToolState / divert-lab-leaf-panther
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 often-woman-over-amused
  1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus
  2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
  3 StringLeaf 8941973283600778650 /3 custom-empower-apart-damage
  4 StringLeaf 801 /4 release-kingdom-various-pelican
node4 6m 16.149s 2025-09-26 05:51:01.342 1801 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T05+50+45.266952688Z_seq1_minr736_maxr1236_orgn763.pces
Last file: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T05+45+01.281104503Z_seq0_minr1_maxr376_orgn0.pces
node4 6m 16.149s 2025-09-26 05:51:01.342 1802 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 775
File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T05+50+45.266952688Z_seq1_minr736_maxr1236_orgn763.pces
node4 6m 16.150s 2025-09-26 05:51:01.343 1803 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 6m 16.153s 2025-09-26 05:51:01.346 1804 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 6m 16.154s 2025-09-26 05:51:01.347 1805 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 802 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/802 {"round":802,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/802/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 7m 16.515s 2025-09-26 05:52:01.708 10535 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 935 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 7m 16.531s 2025-09-26 05:52:01.724 10614 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 935 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 7m 16.552s 2025-09-26 05:52:01.745 11088 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 935 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 7m 16.588s 2025-09-26 05:52:01.781 3271 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 935 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 7m 16.625s 2025-09-26 05:52:01.818 10579 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 935 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 7m 16.697s 2025-09-26 05:52:01.890 11094 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 935 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/935
node1 7m 16.698s 2025-09-26 05:52:01.891 11095 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 935
node3 7m 16.743s 2025-09-26 05:52:01.936 10585 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 935 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/935
node3 7m 16.744s 2025-09-26 05:52:01.937 10586 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 935
node2 7m 16.768s 2025-09-26 05:52:01.961 10632 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 935 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/935
node2 7m 16.768s 2025-09-26 05:52:01.961 10633 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/44 for round 935
node1 7m 16.786s 2025-09-26 05:52:01.979 11131 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 935
node0 7m 16.788s 2025-09-26 05:52:01.981 10553 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 935 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/935
node0 7m 16.789s 2025-09-26 05:52:01.982 10554 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 935
node1 7m 16.791s 2025-09-26 05:52:01.984 11132 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 935
Timestamp: 2025-09-26T05:52:00.225349Z
Next consensus number: 29207
Legacy running event hash: 641b595ccd816fef0592c9e64524ad38cc228d669ec76294bbf75ba8d0e99ac75ea423c52ca681aee1443d9a32d24f4e
Legacy running event mnemonic: lock-clog-exit-worth
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1368736575
Root hash: 90cd781086cc7cdd566b8b500ac740093cccb91f8acfac9d38dc804c5d9b33aa311947cf19a76a15f8aa52290aac95d4
(root) ConsistencyTestingToolState / toy-acoustic-charge-option
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 base-wall-food-rack
  1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus
  2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
  3 StringLeaf -2980146730334453751 /3 lizard-pizza-fancy-moral
  4 StringLeaf 934 /4 sure-build-opinion-civil
node1 7m 16.800s 2025-09-26 05:52:01.993 11133 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+45+01.519856055Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+48+49.940654922Z_seq1_minr474_maxr5474_orgn0.pces
node1 7m 16.800s 2025-09-26 05:52:01.993 11134 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 908
File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+48+49.940654922Z_seq1_minr474_maxr5474_orgn0.pces
node1 7m 16.800s 2025-09-26 05:52:01.993 11135 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 7m 16.809s 2025-09-26 05:52:02.002 11144 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 7m 16.809s 2025-09-26 05:52:02.002 11145 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 935 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/935 {"round":935,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/935/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 7m 16.811s 2025-09-26 05:52:02.004 11146 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/253
node3 7m 16.832s 2025-09-26 05:52:02.025 10626 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 935
node3 7m 16.834s 2025-09-26 05:52:02.027 10627 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 935
Timestamp: 2025-09-26T05:52:00.225349Z
Next consensus number: 29207
Legacy running event hash: 641b595ccd816fef0592c9e64524ad38cc228d669ec76294bbf75ba8d0e99ac75ea423c52ca681aee1443d9a32d24f4e
Legacy running event mnemonic: lock-clog-exit-worth
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1368736575
Root hash: 90cd781086cc7cdd566b8b500ac740093cccb91f8acfac9d38dc804c5d9b33aa311947cf19a76a15f8aa52290aac95d4
(root) ConsistencyTestingToolState / toy-acoustic-charge-option
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 base-wall-food-rack
  1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus
  2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
  3 StringLeaf -2980146730334453751 /3 lizard-pizza-fancy-moral
  4 StringLeaf 934 /4 sure-build-opinion-civil
node3 7m 16.843s 2025-09-26 05:52:02.036 10628 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+48+50.021850797Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+45+01.388448799Z_seq0_minr1_maxr501_orgn0.pces
node3 7m 16.844s 2025-09-26 05:52:02.037 10629 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 908
File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+48+50.021850797Z_seq1_minr474_maxr5474_orgn0.pces
node3 7m 16.844s 2025-09-26 05:52:02.037 10630 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 7m 16.853s 2025-09-26 05:52:02.046 10631 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 7m 16.853s 2025-09-26 05:52:02.046 10632 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 935 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/935 {"round":935,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/935/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 7m 16.855s 2025-09-26 05:52:02.048 10633 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/253
node2 7m 16.859s 2025-09-26 05:52:02.052 10669 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/44 for round 935
node2 7m 16.861s 2025-09-26 05:52:02.054 10670 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 935
Timestamp: 2025-09-26T05:52:00.225349Z
Next consensus number: 29207
Legacy running event hash: 641b595ccd816fef0592c9e64524ad38cc228d669ec76294bbf75ba8d0e99ac75ea423c52ca681aee1443d9a32d24f4e
Legacy running event mnemonic: lock-clog-exit-worth
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1368736575
Root hash: 90cd781086cc7cdd566b8b500ac740093cccb91f8acfac9d38dc804c5d9b33aa311947cf19a76a15f8aa52290aac95d4
(root) ConsistencyTestingToolState / toy-acoustic-charge-option
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 base-wall-food-rack
  1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus
  2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
  3 StringLeaf -2980146730334453751 /3 lizard-pizza-fancy-moral
  4 StringLeaf 934 /4 sure-build-opinion-civil
node2 7m 16.868s 2025-09-26 05:52:02.061 10671 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+45+01.285521102Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+48+50.059759146Z_seq1_minr474_maxr5474_orgn0.pces
node2 7m 16.869s 2025-09-26 05:52:02.062 10672 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 908
File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+48+50.059759146Z_seq1_minr474_maxr5474_orgn0.pces
node2 7m 16.869s 2025-09-26 05:52:02.062 10673 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 7m 16.874s 2025-09-26 05:52:02.067 3277 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 935 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/935
node4 7m 16.875s 2025-09-26 05:52:02.068 3278 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/14 for round 935
node2 7m 16.878s 2025-09-26 05:52:02.071 10674 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 7m 16.878s 2025-09-26 05:52:02.071 10675 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 935 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/935 {"round":935,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/935/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 7m 16.880s 2025-09-26 05:52:02.073 10594 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 935
node2 7m 16.880s 2025-09-26 05:52:02.073 10676 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/253
node0 7m 16.883s 2025-09-26 05:52:02.076 10595 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 935
Timestamp: 2025-09-26T05:52:00.225349Z
Next consensus number: 29207
Legacy running event hash: 641b595ccd816fef0592c9e64524ad38cc228d669ec76294bbf75ba8d0e99ac75ea423c52ca681aee1443d9a32d24f4e
Legacy running event mnemonic: lock-clog-exit-worth
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1368736575
Root hash: 90cd781086cc7cdd566b8b500ac740093cccb91f8acfac9d38dc804c5d9b33aa311947cf19a76a15f8aa52290aac95d4
(root) ConsistencyTestingToolState / toy-acoustic-charge-option
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 base-wall-food-rack
  1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus
  2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
  3 StringLeaf -2980146730334453751 /3 lizard-pizza-fancy-moral
  4 StringLeaf 934 /4 sure-build-opinion-civil
node0 7m 16.893s 2025-09-26 05:52:02.086 10596 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+45+01.471998334Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+48+49.995300241Z_seq1_minr474_maxr5474_orgn0.pces
node0 7m 16.893s 2025-09-26 05:52:02.086 10597 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 908
File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+48+49.995300241Z_seq1_minr474_maxr5474_orgn0.pces
node0 7m 16.893s 2025-09-26 05:52:02.086 10598 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 7m 16.902s 2025-09-26 05:52:02.095 10599 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 7m 16.903s 2025-09-26 05:52:02.096 10600 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 935 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/935 {"round":935,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/935/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 7m 16.904s 2025-09-26 05:52:02.097 10601 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/253
node4 7m 17.023s 2025-09-26 05:52:02.216 3321 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/14 for round 935
node4 7m 17.026s 2025-09-26 05:52:02.219 3322 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 935
Timestamp: 2025-09-26T05:52:00.225349Z
Next consensus number: 29207
Legacy running event hash: 641b595ccd816fef0592c9e64524ad38cc228d669ec76294bbf75ba8d0e99ac75ea423c52ca681aee1443d9a32d24f4e
Legacy running event mnemonic: lock-clog-exit-worth
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1368736575
Root hash: 90cd781086cc7cdd566b8b500ac740093cccb91f8acfac9d38dc804c5d9b33aa311947cf19a76a15f8aa52290aac95d4
(root) ConsistencyTestingToolState / toy-acoustic-charge-option
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 base-wall-food-rack
  1 SingletonNode RosterService.ROSTER_STATE /1 shift-normal-battle-bonus
  2 VirtualMap RosterService.ROSTERS /2 captain-cat-usual-hurdle
  3 StringLeaf -2980146730334453751 /3 lizard-pizza-fancy-moral
  4 StringLeaf 934 /4 sure-build-opinion-civil
node4 7m 17.036s 2025-09-26 05:52:02.229 3323 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T05+50+45.266952688Z_seq1_minr736_maxr1236_orgn763.pces
Last file: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T05+45+01.281104503Z_seq0_minr1_maxr376_orgn0.pces
node4 7m 17.037s 2025-09-26 05:52:02.230 3324 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 908
File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T05+50+45.266952688Z_seq1_minr736_maxr1236_orgn763.pces
node4 7m 17.037s 2025-09-26 05:52:02.230 3325 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 7m 17.042s 2025-09-26 05:52:02.235 3326 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 7m 17.042s 2025-09-26 05:52:02.235 3327 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 935 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/935 {"round":935,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/935/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 7m 17.044s 2025-09-26 05:52:02.237 3328 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/2