Columns: node ID, elapsed time, timestamp, sequence number, log level, log marker, thread, class, message
node4 0.000ns 2025-09-26 03:03:14.994 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node4 88.000ms 2025-09-26 03:03:15.082 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
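Each single-line entry in this capture follows the column layout noted at the top (node ID, elapsed time, wall-clock timestamp, per-node sequence number, level, marker, thread, class, message). A minimal parsing sketch under that assumption; the class name and regular expression below are illustrative and are not part of the platform code:

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class LogLineParser {
    // Illustrative pattern for single-line entries in this capture:
    // node id, elapsed time, timestamp, sequence number, level, marker,
    // <thread>, class, free-form message. Continuation lines (banners,
    // health-check reports, roster JSON) belong to the preceding entry
    // and deliberately do not match.
    private static final Pattern LINE = Pattern.compile(
            "^(node\\d+)\\s+(\\S+)\\s+(\\d{4}-\\d{2}-\\d{2} \\d{2}:\\d{2}:\\d{2}\\.\\d{3})\\s+"
                    + "(\\d+)\\s+(\\w+)\\s+(\\w+)\\s+(<[^>]*>|<<[^>]*>>)\\s+([\\w$]+):\\s?(.*)$");

    public static void main(String[] args) {
        // Sample taken verbatim from the entry directly above.
        String sample = "node4 88.000ms 2025-09-26 03:03:15.082 2 DEBUG STARTUP <main> "
                + "StaticPlatformBuilder: main() started {} "
                + "[com.swirlds.logging.legacy.payload.NodeStartPayload]";
        Matcher m = LINE.matcher(sample);
        if (m.matches()) {
            System.out.println("node=" + m.group(1) + " level=" + m.group(5)
                    + " marker=" + m.group(6) + " class=" + m.group(8));
        }
    }
}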
node4 104.000ms 2025-09-26 03:03:15.098 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 217.000ms 2025-09-26 03:03:15.211 4 INFO STARTUP <main> Browser: The following nodes [4] are set to run locally
node4 223.000ms 2025-09-26 03:03:15.217 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node4 235.000ms 2025-09-26 03:03:15.229 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 653.000ms 2025-09-26 03:03:15.647 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node4 654.000ms 2025-09-26 03:03:15.648 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node0 839.000ms 2025-09-26 03:03:15.833 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node0 934.000ms 2025-09-26 03:03:15.928 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node0 952.000ms 2025-09-26 03:03:15.946 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node2 963.000ms 2025-09-26 03:03:15.957 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node2 1.052s 2025-09-26 03:03:16.046 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node2 1.068s 2025-09-26 03:03:16.062 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node0 1.074s 2025-09-26 03:03:16.068 4 INFO STARTUP <main> Browser: The following nodes [0] are set to run locally
node0 1.082s 2025-09-26 03:03:16.076 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node0 1.095s 2025-09-26 03:03:16.089 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node2 1.186s 2025-09-26 03:03:16.180 4 INFO STARTUP <main> Browser: The following nodes [2] are set to run locally
node2 1.194s 2025-09-26 03:03:16.188 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node2 1.206s 2025-09-26 03:03:16.200 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node3 1.371s 2025-09-26 03:03:16.365 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node4 1.475s 2025-09-26 03:03:16.469 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 820ms
node3 1.483s 2025-09-26 03:03:16.477 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node4 1.484s 2025-09-26 03:03:16.478 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node4 1.487s 2025-09-26 03:03:16.481 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node3 1.503s 2025-09-26 03:03:16.497 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 1.523s 2025-09-26 03:03:16.517 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node0 1.541s 2025-09-26 03:03:16.535 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node0 1.542s 2025-09-26 03:03:16.536 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node4 1.599s 2025-09-26 03:03:16.593 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node4 1.600s 2025-09-26 03:03:16.594 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node3 1.644s 2025-09-26 03:03:16.638 4 INFO STARTUP <main> Browser: The following nodes [3] are set to run locally
node2 1.645s 2025-09-26 03:03:16.639 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node2 1.646s 2025-09-26 03:03:16.640 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node3 1.652s 2025-09-26 03:03:16.646 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node3 1.666s 2025-09-26 03:03:16.660 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node1 1.883s 2025-09-26 03:03:16.877 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node1 1.980s 2025-09-26 03:03:16.974 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node1 1.998s 2025-09-26 03:03:16.992 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node1 2.121s 2025-09-26 03:03:17.115 4 INFO STARTUP <main> Browser: The following nodes [1] are set to run locally
node1 2.129s 2025-09-26 03:03:17.123 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node1 2.141s 2025-09-26 03:03:17.135 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node3 2.161s 2025-09-26 03:03:17.155 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node3 2.162s 2025-09-26 03:03:17.156 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node1 2.617s 2025-09-26 03:03:17.611 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node1 2.618s 2025-09-26 03:03:17.612 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node0 2.717s 2025-09-26 03:03:17.711 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1173ms
node0 2.726s 2025-09-26 03:03:17.720 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node0 2.730s 2025-09-26 03:03:17.724 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node2 2.734s 2025-09-26 03:03:17.728 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1088ms
node2 2.743s 2025-09-26 03:03:17.737 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node2 2.746s 2025-09-26 03:03:17.740 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node0 2.772s 2025-09-26 03:03:17.766 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node2 2.785s 2025-09-26 03:03:17.779 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node0 2.838s 2025-09-26 03:03:17.832 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node0 2.839s 2025-09-26 03:03:17.833 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node2 2.845s 2025-09-26 03:03:17.839 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node2 2.846s 2025-09-26 03:03:17.840 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node3 3.211s 2025-09-26 03:03:18.205 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1048ms
node3 3.221s 2025-09-26 03:03:18.215 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node3 3.225s 2025-09-26 03:03:18.219 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node3 3.266s 2025-09-26 03:03:18.260 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node3 3.329s 2025-09-26 03:03:18.323 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node3 3.330s 2025-09-26 03:03:18.324 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node4 3.628s 2025-09-26 03:03:18.622 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node1 3.687s 2025-09-26 03:03:18.681 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1068ms
node1 3.696s 2025-09-26 03:03:18.690 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node1 3.699s 2025-09-26 03:03:18.693 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 3.712s 2025-09-26 03:03:18.706 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 3.714s 2025-09-26 03:03:18.708 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node4 3.715s 2025-09-26 03:03:18.709 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node1 3.743s 2025-09-26 03:03:18.737 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node1 3.809s 2025-09-26 03:03:18.803 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node1 3.810s 2025-09-26 03:03:18.804 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node4 4.494s 2025-09-26 03:03:19.488 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 4.497s 2025-09-26 03:03:19.491 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node4 4.503s 2025-09-26 03:03:19.497 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node4 4.513s 2025-09-26 03:03:19.507 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 4.515s 2025-09-26 03:03:19.509 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 4.947s 2025-09-26 03:03:19.941 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node2 4.999s 2025-09-26 03:03:19.993 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node0 5.045s 2025-09-26 03:03:20.039 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 5.047s 2025-09-26 03:03:20.041 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node0 5.048s 2025-09-26 03:03:20.042 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node2 5.082s 2025-09-26 03:03:20.076 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 5.084s 2025-09-26 03:03:20.078 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node2 5.085s 2025-09-26 03:03:20.079 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node3 5.376s 2025-09-26 03:03:20.370 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node3 5.458s 2025-09-26 03:03:20.452 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 5.461s 2025-09-26 03:03:20.455 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node3 5.462s 2025-09-26 03:03:20.456 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 5.633s 2025-09-26 03:03:20.627 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26230122]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=130040, randomLong=1807435557812068868, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=10270, randomLong=-8589051423730855882, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1268690, data=35, exception=null]
OS Health Check Report - Complete (took 1023 ms)
node4 5.665s 2025-09-26 03:03:20.659 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node4 5.673s 2025-09-26 03:03:20.667 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node4 5.678s 2025-09-26 03:03:20.672 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node4 5.758s 2025-09-26 03:03:20.752 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "InsIeg==", "port": 30124 }, { "ipAddressV4": "CoAAWQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "I+gaFw==", "port": 30125 }, { "ipAddressV4": "CoAAWw==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "InlxqA==", "port": 30126 }, { "ipAddressV4": "CoAAWA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "aJrAnw==", "port": 30127 }, { "ipAddressV4": "CoAACA==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "I+9Y8g==", "port": 30128 }, { "ipAddressV4": "CoAAWg==", "port": 30128 }] }] }
node4 5.778s 2025-09-26 03:03:20.772 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 5.779s 2025-09-26 03:03:20.773 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node4 5.793s 2025-09-26 03:03:20.787 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: 936b097ef2ea874665663b7d461ba58c643bbcdd8386722f158e7585c74f47ef2399b90b628c1f2e0c2313a4aa25ba24
(root) ConsistencyTestingToolState / naive-cool-people-blanket
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate
    1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
    2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
node1 5.874s 2025-09-26 03:03:20.868 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node2 5.901s 2025-09-26 03:03:20.895 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 5.904s 2025-09-26 03:03:20.898 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node0 5.907s 2025-09-26 03:03:20.901 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 5.910s 2025-09-26 03:03:20.904 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node0 5.912s 2025-09-26 03:03:20.906 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node0 5.919s 2025-09-26 03:03:20.913 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node2 5.922s 2025-09-26 03:03:20.916 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 5.924s 2025-09-26 03:03:20.918 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 5.932s 2025-09-26 03:03:20.926 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 5.935s 2025-09-26 03:03:20.929 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 5.963s 2025-09-26 03:03:20.957 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 5.966s 2025-09-26 03:03:20.960 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node1 5.966s 2025-09-26 03:03:20.960 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 5.989s 2025-09-26 03:03:20.983 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node4 5.994s 2025-09-26 03:03:20.988 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node4 5.999s 2025-09-26 03:03:20.993 47 INFO STARTUP <<start-node-4>> ConsistencyTestingToolMain: init called in Main for node 4.
node4 6.000s 2025-09-26 03:03:20.994 48 INFO STARTUP <<start-node-4>> SwirldsPlatform: Starting platform 4
node4 6.001s 2025-09-26 03:03:20.995 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node4 6.004s 2025-09-26 03:03:20.998 50 INFO STARTUP <<start-node-4>> CycleFinder: No cyclical back pressure detected in wiring model.
node4 6.005s 2025-09-26 03:03:20.999 51 INFO STARTUP <<start-node-4>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node4 6.006s 2025-09-26 03:03:21.000 52 INFO STARTUP <<start-node-4>> InputWireChecks: All input wires have been bound.
node4 6.008s 2025-09-26 03:03:21.002 53 WARN STARTUP <<start-node-4>> PcesFileTracker: No preconsensus event files available
node4 6.009s 2025-09-26 03:03:21.003 54 INFO STARTUP <<start-node-4>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node4 6.010s 2025-09-26 03:03:21.004 55 INFO STARTUP <<start-node-4>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node4 6.011s 2025-09-26 03:03:21.005 56 INFO STARTUP <<app: appMain 4>> ConsistencyTestingToolMain: run called in Main.
node4 6.012s 2025-09-26 03:03:21.006 57 INFO PLATFORM_STATUS <platformForkJoinThread-3> DefaultStatusStateMachine: Platform spent 163.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node4 6.017s 2025-09-26 03:03:21.011 58 INFO PLATFORM_STATUS <platformForkJoinThread-3> DefaultStatusStateMachine: Platform spent 3.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node3 6.266s 2025-09-26 03:03:21.260 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 6.270s 2025-09-26 03:03:21.264 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node3 6.276s 2025-09-26 03:03:21.270 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node3 6.287s 2025-09-26 03:03:21.281 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 6.289s 2025-09-26 03:03:21.283 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 6.816s 2025-09-26 03:03:21.810 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 6.820s 2025-09-26 03:03:21.814 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node1 6.826s 2025-09-26 03:03:21.820 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node1 6.839s 2025-09-26 03:03:21.833 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 6.841s 2025-09-26 03:03:21.835 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 7.047s 2025-09-26 03:03:22.041 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26373800]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=177909, randomLong=-2907115563054037214, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=10490, randomLong=6594801414907290404, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1641859, data=35, exception=null]
OS Health Check Report - Complete (took 1023 ms)
node0 7.055s 2025-09-26 03:03:22.049 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26329271]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=213939, randomLong=-8935216261631504, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=10280, randomLong=-5796717305946624007, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1493176, data=35, exception=null]
OS Health Check Report - Complete (took 1025 ms)
node2 7.079s 2025-09-26 03:03:22.073 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node2 7.087s 2025-09-26 03:03:22.081 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node0 7.090s 2025-09-26 03:03:22.084 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node2 7.093s 2025-09-26 03:03:22.087 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node0 7.098s 2025-09-26 03:03:22.092 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node0 7.105s 2025-09-26 03:03:22.099 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node2 7.175s 2025-09-26 03:03:22.169 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "InsIeg==", "port": 30124 }, { "ipAddressV4": "CoAAWQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "I+gaFw==", "port": 30125 }, { "ipAddressV4": "CoAAWw==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "InlxqA==", "port": 30126 }, { "ipAddressV4": "CoAAWA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "aJrAnw==", "port": 30127 }, { "ipAddressV4": "CoAACA==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "I+9Y8g==", "port": 30128 }, { "ipAddressV4": "CoAAWg==", "port": 30128 }] }] }
node0 7.187s 2025-09-26 03:03:22.181 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "InsIeg==", "port": 30124 }, { "ipAddressV4": "CoAAWQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "I+gaFw==", "port": 30125 }, { "ipAddressV4": "CoAAWw==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "InlxqA==", "port": 30126 }, { "ipAddressV4": "CoAAWA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "aJrAnw==", "port": 30127 }, { "ipAddressV4": "CoAACA==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "I+9Y8g==", "port": 30128 }, { "ipAddressV4": "CoAAWg==", "port": 30128 }] }] }
node2 7.196s 2025-09-26 03:03:22.190 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/2/ConsistencyTestLog.csv
node2 7.196s 2025-09-26 03:03:22.190 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node0 7.210s 2025-09-26 03:03:22.204 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/0/ConsistencyTestLog.csv
node0 7.210s 2025-09-26 03:03:22.204 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node2 7.211s 2025-09-26 03:03:22.205 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: 936b097ef2ea874665663b7d461ba58c643bbcdd8386722f158e7585c74f47ef2399b90b628c1f2e0c2313a4aa25ba24
(root) ConsistencyTestingToolState / naive-cool-people-blanket
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate
    1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
    2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
node0 7.226s 2025-09-26 03:03:22.220 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: 936b097ef2ea874665663b7d461ba58c643bbcdd8386722f158e7585c74f47ef2399b90b628c1f2e0c2313a4aa25ba24
(root) ConsistencyTestingToolState / naive-cool-people-blanket
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate
    1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
    2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
node3 7.402s 2025-09-26 03:03:22.396 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26274431]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=200510, randomLong=397821809065087032, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=10940, randomLong=491944550205202959, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1408730, data=35, exception=null]
OS Health Check Report - Complete (took 1024 ms)
node3 7.435s 2025-09-26 03:03:22.429 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node2 7.438s 2025-09-26 03:03:22.432 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node3 7.443s 2025-09-26 03:03:22.437 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node2 7.444s 2025-09-26 03:03:22.438 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node3 7.449s 2025-09-26 03:03:22.443 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node2 7.450s 2025-09-26 03:03:22.444 47 INFO STARTUP <<start-node-2>> ConsistencyTestingToolMain: init called in Main for node 2.
node2 7.451s 2025-09-26 03:03:22.445 48 INFO STARTUP <<start-node-2>> SwirldsPlatform: Starting platform 2
node2 7.452s 2025-09-26 03:03:22.446 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node2 7.456s 2025-09-26 03:03:22.450 50 INFO STARTUP <<start-node-2>> CycleFinder: No cyclical back pressure detected in wiring model.
node2 7.457s 2025-09-26 03:03:22.451 51 INFO STARTUP <<start-node-2>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node2 7.458s 2025-09-26 03:03:22.452 52 INFO STARTUP <<start-node-2>> InputWireChecks: All input wires have been bound.
node2 7.460s 2025-09-26 03:03:22.454 53 WARN STARTUP <<start-node-2>> PcesFileTracker: No preconsensus event files available
node2 7.460s 2025-09-26 03:03:22.454 54 INFO STARTUP <<start-node-2>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node2 7.462s 2025-09-26 03:03:22.456 55 INFO STARTUP <<start-node-2>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node2 7.463s 2025-09-26 03:03:22.457 56 INFO STARTUP <<app: appMain 2>> ConsistencyTestingToolMain: run called in Main.
node2 7.464s 2025-09-26 03:03:22.458 57 INFO PLATFORM_STATUS <platformForkJoinThread-3> DefaultStatusStateMachine: Platform spent 197.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node0 7.465s 2025-09-26 03:03:22.459 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node2 7.470s 2025-09-26 03:03:22.464 58 INFO PLATFORM_STATUS <platformForkJoinThread-3> DefaultStatusStateMachine: Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node0 7.471s 2025-09-26 03:03:22.465 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node0 7.477s 2025-09-26 03:03:22.471 47 INFO STARTUP <<start-node-0>> ConsistencyTestingToolMain: init called in Main for node 0.
node0 7.477s 2025-09-26 03:03:22.471 48 INFO STARTUP <<start-node-0>> SwirldsPlatform: Starting platform 0
node0 7.478s 2025-09-26 03:03:22.472 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node0 7.482s 2025-09-26 03:03:22.476 50 INFO STARTUP <<start-node-0>> CycleFinder: No cyclical back pressure detected in wiring model.
node0 7.483s 2025-09-26 03:03:22.477 51 INFO STARTUP <<start-node-0>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node0 7.484s 2025-09-26 03:03:22.478 52 INFO STARTUP <<start-node-0>> InputWireChecks: All input wires have been bound.
node0 7.485s 2025-09-26 03:03:22.479 53 WARN STARTUP <<start-node-0>> PcesFileTracker: No preconsensus event files available
node0 7.485s 2025-09-26 03:03:22.479 54 INFO STARTUP <<start-node-0>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node0 7.487s 2025-09-26 03:03:22.481 55 INFO STARTUP <<start-node-0>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node0 7.488s 2025-09-26 03:03:22.482 56 INFO STARTUP <<app: appMain 0>> ConsistencyTestingToolMain: run called in Main.
node0 7.490s 2025-09-26 03:03:22.484 57 INFO PLATFORM_STATUS <platformForkJoinThread-1> DefaultStatusStateMachine: Platform spent 206.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node0 7.496s 2025-09-26 03:03:22.490 58 INFO PLATFORM_STATUS <platformForkJoinThread-6> DefaultStatusStateMachine: Platform spent 5.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node3 7.531s 2025-09-26 03:03:22.525 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "InsIeg==", "port": 30124 }, { "ipAddressV4": "CoAAWQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "I+gaFw==", "port": 30125 }, { "ipAddressV4": "CoAAWw==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "InlxqA==", "port": 30126 }, { "ipAddressV4": "CoAAWA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "aJrAnw==", "port": 30127 }, { "ipAddressV4": "CoAACA==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "I+9Y8g==", "port": 30128 }, { "ipAddressV4": "CoAAWg==", "port": 30128 }] }] }
node3 7.552s 2025-09-26 03:03:22.546 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/3/ConsistencyTestLog.csv
node3 7.553s 2025-09-26 03:03:22.547 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node3 7.568s 2025-09-26 03:03:22.562 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: 936b097ef2ea874665663b7d461ba58c643bbcdd8386722f158e7585c74f47ef2399b90b628c1f2e0c2313a4aa25ba24 (root) ConsistencyTestingToolState / naive-cool-people-blanket 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate 1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire 2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
node3 7.786s 2025-09-26 03:03:22.780 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node3 7.791s 2025-09-26 03:03:22.785 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node3 7.796s 2025-09-26 03:03:22.790 47 INFO STARTUP <<start-node-3>> ConsistencyTestingToolMain: init called in Main for node 3.
node3 7.796s 2025-09-26 03:03:22.790 48 INFO STARTUP <<start-node-3>> SwirldsPlatform: Starting platform 3
node3 7.797s 2025-09-26 03:03:22.791 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node3 7.801s 2025-09-26 03:03:22.795 50 INFO STARTUP <<start-node-3>> CycleFinder: No cyclical back pressure detected in wiring model.
node3 7.802s 2025-09-26 03:03:22.796 51 INFO STARTUP <<start-node-3>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node3 7.803s 2025-09-26 03:03:22.797 52 INFO STARTUP <<start-node-3>> InputWireChecks: All input wires have been bound.
node3 7.804s 2025-09-26 03:03:22.798 53 WARN STARTUP <<start-node-3>> PcesFileTracker: No preconsensus event files available
node3 7.805s 2025-09-26 03:03:22.799 54 INFO STARTUP <<start-node-3>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node3 7.806s 2025-09-26 03:03:22.800 55 INFO STARTUP <<start-node-3>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node3 7.808s 2025-09-26 03:03:22.802 56 INFO STARTUP <<app: appMain 3>> ConsistencyTestingToolMain: run called in Main.
node3 7.808s 2025-09-26 03:03:22.802 57 INFO PLATFORM_STATUS <platformForkJoinThread-4> DefaultStatusStateMachine: Platform spent 184.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node3 7.813s 2025-09-26 03:03:22.807 58 INFO PLATFORM_STATUS <platformForkJoinThread-4> DefaultStatusStateMachine: Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node1 7.972s 2025-09-26 03:03:22.966 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26133525] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=232550, randomLong=6293842059532619753, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=12460, randomLong=7324291501810581907, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=1260000, data=35, exception=null] OS Health Check Report - Complete (took 1024 ms)
node1 8.007s 2025-09-26 03:03:23.001 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node1 8.016s 2025-09-26 03:03:23.010 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node1 8.022s 2025-09-26 03:03:23.016 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node1 8.104s 2025-09-26 03:03:23.098 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "InsIeg==", "port": 30124 }, { "ipAddressV4": "CoAAWQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "I+gaFw==", "port": 30125 }, { "ipAddressV4": "CoAAWw==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "InlxqA==", "port": 30126 }, { "ipAddressV4": "CoAAWA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "aJrAnw==", "port": 30127 }, { "ipAddressV4": "CoAACA==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "I+9Y8g==", "port": 30128 }, { "ipAddressV4": "CoAAWg==", "port": 30128 }] }] }
node1 8.126s 2025-09-26 03:03:23.120 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/1/ConsistencyTestLog.csv
node1 8.127s 2025-09-26 03:03:23.121 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node1 8.143s 2025-09-26 03:03:23.137 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: 936b097ef2ea874665663b7d461ba58c643bbcdd8386722f158e7585c74f47ef2399b90b628c1f2e0c2313a4aa25ba24 (root) ConsistencyTestingToolState / naive-cool-people-blanket 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate 1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire 2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
node1 8.383s 2025-09-26 03:03:23.377 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node1 8.388s 2025-09-26 03:03:23.382 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node1 8.393s 2025-09-26 03:03:23.387 47 INFO STARTUP <<start-node-1>> ConsistencyTestingToolMain: init called in Main for node 1.
node1 8.394s 2025-09-26 03:03:23.388 48 INFO STARTUP <<start-node-1>> SwirldsPlatform: Starting platform 1
node1 8.395s 2025-09-26 03:03:23.389 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node1 8.398s 2025-09-26 03:03:23.392 50 INFO STARTUP <<start-node-1>> CycleFinder: No cyclical back pressure detected in wiring model.
node1 8.400s 2025-09-26 03:03:23.394 51 INFO STARTUP <<start-node-1>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node1 8.400s 2025-09-26 03:03:23.394 52 INFO STARTUP <<start-node-1>> InputWireChecks: All input wires have been bound.
node1 8.402s 2025-09-26 03:03:23.396 53 WARN STARTUP <<start-node-1>> PcesFileTracker: No preconsensus event files available
node1 8.402s 2025-09-26 03:03:23.396 54 INFO STARTUP <<start-node-1>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node1 8.403s 2025-09-26 03:03:23.397 55 INFO STARTUP <<start-node-1>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node1 8.404s 2025-09-26 03:03:23.398 56 INFO STARTUP <<app: appMain 1>> ConsistencyTestingToolMain: run called in Main.
node1 8.406s 2025-09-26 03:03:23.400 57 INFO PLATFORM_STATUS <platformForkJoinThread-1> DefaultStatusStateMachine: Platform spent 203.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node1 8.411s 2025-09-26 03:03:23.405 58 INFO PLATFORM_STATUS <platformForkJoinThread-1> DefaultStatusStateMachine: Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING
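The offset in the first field appears to be measured against a single reference shared by every node rather than each node's own start time: across nodes, differences between offsets match differences between wall-clock timestamps. A minimal check (not platform code) using node0's and node1's entry 56 above:

    import java.time.Duration;
    import java.time.LocalTime;

    // Sketch only: compares the offset column against wall-clock timestamps for
    // node0 entry 56 (offset 7.488s, 03:03:22.482) and node1 entry 56
    // (offset 8.404s, 03:03:23.398). Both differences come out to 916 ms.
    public class OffsetColumnCheck {
        public static void main(String[] args) {
            LocalTime node0 = LocalTime.parse("03:03:22.482");
            LocalTime node1 = LocalTime.parse("03:03:23.398");
            System.out.println(Duration.between(node0, node1).toMillis() + " ms"); // 916 ms
            System.out.println(Math.round((8.404 - 7.488) * 1000) + " ms");        // 916 ms
        }
    }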
node4 9.011s 2025-09-26 03:03:24.005 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting4.csv' ]
node4 9.012s 2025-09-26 03:03:24.006 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node2 10.463s 2025-09-26 03:03:25.457 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting2.csv' ]
node2 10.466s 2025-09-26 03:03:25.460 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node0 10.490s 2025-09-26 03:03:25.484 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting0.csv' ]
node0 10.493s 2025-09-26 03:03:25.487 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node3 10.811s 2025-09-26 03:03:25.805 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting3.csv' ]
node3 10.814s 2025-09-26 03:03:25.808 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node1 11.407s 2025-09-26 03:03:26.401 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting1.csv' ]
node1 11.410s 2025-09-26 03:03:26.404 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node4 16.107s 2025-09-26 03:03:31.101 61 INFO PLATFORM_STATUS <platformForkJoinThread-3> DefaultStatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node2 17.559s 2025-09-26 03:03:32.553 61 INFO PLATFORM_STATUS <platformForkJoinThread-3> DefaultStatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node0 17.585s 2025-09-26 03:03:32.579 61 INFO PLATFORM_STATUS <platformForkJoinThread-6> DefaultStatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node3 17.904s 2025-09-26 03:03:32.898 61 INFO PLATFORM_STATUS <platformForkJoinThread-2> DefaultStatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node1 18.503s 2025-09-26 03:03:33.497 61 INFO PLATFORM_STATUS <platformForkJoinThread-7> DefaultStatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node2 18.818s 2025-09-26 03:03:33.812 63 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node3 18.863s 2025-09-26 03:03:33.857 63 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node1 18.879s 2025-09-26 03:03:33.873 63 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node0 18.898s 2025-09-26 03:03:33.892 63 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node4 18.944s 2025-09-26 03:03:33.938 62 INFO PLATFORM_STATUS <platformForkJoinThread-7> DefaultStatusStateMachine: Platform spent 2.8 s in CHECKING. Now in ACTIVE
node4 18.946s 2025-09-26 03:03:33.940 64 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node0 19.004s 2025-09-26 03:03:33.998 78 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1
node0 19.007s 2025-09-26 03:03:34.001 79 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node3 19.043s 2025-09-26 03:03:34.037 78 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1
node3 19.045s 2025-09-26 03:03:34.039 79 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node2 19.051s 2025-09-26 03:03:34.045 78 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1
node2 19.053s 2025-09-26 03:03:34.047 79 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node1 19.073s 2025-09-26 03:03:34.067 78 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1
node4 19.073s 2025-09-26 03:03:34.067 79 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1
node1 19.075s 2025-09-26 03:03:34.069 79 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node4 19.075s 2025-09-26 03:03:34.069 80 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node2 19.290s 2025-09-26 03:03:34.284 107 INFO PLATFORM_STATUS <platformForkJoinThread-7> DefaultStatusStateMachine: Platform spent 1.7 s in CHECKING. Now in ACTIVE
node2 19.303s 2025-09-26 03:03:34.297 117 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node3 19.305s 2025-09-26 03:03:34.299 109 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node2 19.306s 2025-09-26 03:03:34.300 118 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1 Timestamp: 2025-09-26T03:03:32.589231021Z Next consensus number: 1 Legacy running event hash: 9c21c1369cab24cc30aac2df70ab3d058c1653896eda23329183ff8478854b5d59d1ab511ff49a78f65ab186beb9cc19 Legacy running event mnemonic: know-sun-height-office Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1450302654 Root hash: a50340269daf0ab356ff1589df04ccb09b01e33ce8a3553957fd92bd100d68486103a441212fda3b92308610c9bf58b6 (root) ConsistencyTestingToolState / garden-age-enforce-filter 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 foot-brown-monkey-dumb 1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire 2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub 3 StringLeaf 1931016930446315563 /3 almost-atom-novel-view 4 StringLeaf 1 /4 wreck-whale-old-bottom
node3 19.309s 2025-09-26 03:03:34.303 110 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1 Timestamp: 2025-09-26T03:03:32.589231021Z Next consensus number: 1 Legacy running event hash: 9c21c1369cab24cc30aac2df70ab3d058c1653896eda23329183ff8478854b5d59d1ab511ff49a78f65ab186beb9cc19 Legacy running event mnemonic: know-sun-height-office Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1450302654 Root hash: a50340269daf0ab356ff1589df04ccb09b01e33ce8a3553957fd92bd100d68486103a441212fda3b92308610c9bf58b6 (root) ConsistencyTestingToolState / garden-age-enforce-filter 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 foot-brown-monkey-dumb 1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire 2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub 3 StringLeaf 1931016930446315563 /3 almost-atom-novel-view 4 StringLeaf 1 /4 wreck-whale-old-bottom
node0 19.310s 2025-09-26 03:03:34.304 109 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node0 19.313s 2025-09-26 03:03:34.307 110 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1 Timestamp: 2025-09-26T03:03:32.589231021Z Next consensus number: 1 Legacy running event hash: 9c21c1369cab24cc30aac2df70ab3d058c1653896eda23329183ff8478854b5d59d1ab511ff49a78f65ab186beb9cc19 Legacy running event mnemonic: know-sun-height-office Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1450302654 Root hash: a50340269daf0ab356ff1589df04ccb09b01e33ce8a3553957fd92bd100d68486103a441212fda3b92308610c9bf58b6 (root) ConsistencyTestingToolState / garden-age-enforce-filter 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 foot-brown-monkey-dumb 1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire 2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub 3 StringLeaf 1931016930446315563 /3 almost-atom-novel-view 4 StringLeaf 1 /4 wreck-whale-old-bottom
node4 19.332s 2025-09-26 03:03:34.326 110 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node4 19.335s 2025-09-26 03:03:34.329 111 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1 Timestamp: 2025-09-26T03:03:32.589231021Z Next consensus number: 1 Legacy running event hash: 9c21c1369cab24cc30aac2df70ab3d058c1653896eda23329183ff8478854b5d59d1ab511ff49a78f65ab186beb9cc19 Legacy running event mnemonic: know-sun-height-office Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1450302654 Root hash: a50340269daf0ab356ff1589df04ccb09b01e33ce8a3553957fd92bd100d68486103a441212fda3b92308610c9bf58b6 (root) ConsistencyTestingToolState / garden-age-enforce-filter 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 foot-brown-monkey-dumb 1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire 2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub 3 StringLeaf 1931016930446315563 /3 almost-atom-novel-view 4 StringLeaf 1 /4 wreck-whale-old-bottom
node3 19.336s 2025-09-26 03:03:34.330 112 INFO PLATFORM_STATUS <platformForkJoinThread-6> DefaultStatusStateMachine: Platform spent 1.4 s in CHECKING. Now in ACTIVE
node2 19.344s 2025-09-26 03:03:34.338 119 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+03+31.391433828Z_seq0_minr1_maxr501_orgn0.pces
node2 19.344s 2025-09-26 03:03:34.338 120 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+03+31.391433828Z_seq0_minr1_maxr501_orgn0.pces
node2 19.345s 2025-09-26 03:03:34.339 121 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 19.345s 2025-09-26 03:03:34.339 122 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 19.351s 2025-09-26 03:03:34.345 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+03+31.256200715Z_seq0_minr1_maxr501_orgn0.pces
node3 19.351s 2025-09-26 03:03:34.345 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+03+31.256200715Z_seq0_minr1_maxr501_orgn0.pces
node2 19.352s 2025-09-26 03:03:34.346 123 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 19.352s 2025-09-26 03:03:34.346 116 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 19.353s 2025-09-26 03:03:34.347 111 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+03+31.273927897Z_seq0_minr1_maxr501_orgn0.pces
node0 19.353s 2025-09-26 03:03:34.347 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+03+31.273927897Z_seq0_minr1_maxr501_orgn0.pces
node3 19.353s 2025-09-26 03:03:34.347 117 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 19.354s 2025-09-26 03:03:34.348 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 19.355s 2025-09-26 03:03:34.349 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 19.359s 2025-09-26 03:03:34.353 109 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node3 19.359s 2025-09-26 03:03:34.353 118 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 19.360s 2025-09-26 03:03:34.354 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 19.363s 2025-09-26 03:03:34.357 110 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1 Timestamp: 2025-09-26T03:03:32.589231021Z Next consensus number: 1 Legacy running event hash: 9c21c1369cab24cc30aac2df70ab3d058c1653896eda23329183ff8478854b5d59d1ab511ff49a78f65ab186beb9cc19 Legacy running event mnemonic: know-sun-height-office Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1450302654 Root hash: a50340269daf0ab356ff1589df04ccb09b01e33ce8a3553957fd92bd100d68486103a441212fda3b92308610c9bf58b6 (root) ConsistencyTestingToolState / garden-age-enforce-filter 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 foot-brown-monkey-dumb 1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire 2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub 3 StringLeaf 1931016930446315563 /3 almost-atom-novel-view 4 StringLeaf 1 /4 wreck-whale-old-bottom
node4 19.369s 2025-09-26 03:03:34.363 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T03+03+31.142041701Z_seq0_minr1_maxr501_orgn0.pces
node4 19.370s 2025-09-26 03:03:34.364 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T03+03+31.142041701Z_seq0_minr1_maxr501_orgn0.pces
node4 19.370s 2025-09-26 03:03:34.364 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 19.371s 2025-09-26 03:03:34.365 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 19.377s 2025-09-26 03:03:34.371 116 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 19.406s 2025-09-26 03:03:34.400 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+03+31.280166514Z_seq0_minr1_maxr501_orgn0.pces
node1 19.407s 2025-09-26 03:03:34.401 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+03+31.280166514Z_seq0_minr1_maxr501_orgn0.pces
node1 19.408s 2025-09-26 03:03:34.402 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 19.409s 2025-09-26 03:03:34.403 116 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 19.416s 2025-09-26 03:03:34.410 117 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
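The directories that SignedStateFileWriter logs share one layout under data/saved: the app's main class name, the node id, a segment taken here to be the swirld name ("123" in this run; that reading is an assumption), and the round number. A small parsing sketch under that assumption:

    import java.nio.file.Path;

    // Sketch only: splits a saved-state directory as logged by SignedStateFileWriter
    // into its apparent components. Interpreting the "123" segment as the swirld
    // name is an assumption, not confirmed by the log itself.
    public class SavedStateDirParser {
        public static void main(String[] args) {
            Path dir = Path.of("/opt/hgcapp/services-hedera/HapiApp2.0/data/saved/"
                    + "com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1");
            int n = dir.getNameCount();
            System.out.println("main class: " + dir.getName(n - 4)); // ConsistencyTestingToolMain FQCN
            System.out.println("node id:    " + dir.getName(n - 3)); // 1
            System.out.println("swirld:     " + dir.getName(n - 2)); // 123 (assumed meaning)
            System.out.println("round:      " + dir.getName(n - 1)); // 1
        }
    }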
node0 19.420s 2025-09-26 03:03:34.414 117 INFO PLATFORM_STATUS <platformForkJoinThread-7> DefaultStatusStateMachine: Platform spent 1.8 s in CHECKING. Now in ACTIVE
node1 19.763s 2025-09-26 03:03:34.757 119 INFO PLATFORM_STATUS <platformForkJoinThread-6> DefaultStatusStateMachine: Platform spent 1.3 s in CHECKING. Now in ACTIVE
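At this point every node has reported the same status progression through DefaultStatusStateMachine. A sketch of the ordering observed in this run (per-node durations above were roughly 0.2 s, 4-5 ms, 10.1 s and 1.3-2.8 s for the first four stages); this lists only the statuses seen here, not the platform's full status set:

    // Sketch only: the platform status ordering as observed in this run's
    // PLATFORM_STATUS lines.
    public class ObservedStatusOrder {
        enum Status { STARTING_UP, REPLAYING_EVENTS, OBSERVING, CHECKING, ACTIVE }

        public static void main(String[] args) {
            for (Status s : Status.values()) {
                System.out.println(s);
            }
        }
    }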
node1 46.800s 2025-09-26 03:04:01.794 765 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 62 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 46.862s 2025-09-26 03:04:01.856 771 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 62 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 46.863s 2025-09-26 03:04:01.857 759 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 62 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 46.886s 2025-09-26 03:04:01.880 775 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 62 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 46.892s 2025-09-26 03:04:01.886 753 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 62 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 47.031s 2025-09-26 03:04:02.025 781 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 62 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/62
node2 47.031s 2025-09-26 03:04:02.025 782 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 62
node4 47.043s 2025-09-26 03:04:02.037 756 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 62 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/62
node4 47.044s 2025-09-26 03:04:02.038 757 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 62
node3 47.054s 2025-09-26 03:04:02.048 777 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 62 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/62
node3 47.055s 2025-09-26 03:04:02.049 778 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 62
node1 47.079s 2025-09-26 03:04:02.073 779 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 62 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/62
node1 47.080s 2025-09-26 03:04:02.074 781 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 62
node0 47.102s 2025-09-26 03:04:02.096 765 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 62 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/62
node0 47.103s 2025-09-26 03:04:02.097 766 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 62
node2 47.116s 2025-09-26 03:04:02.110 822 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 62
node2 47.118s 2025-09-26 03:04:02.112 823 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 62 Timestamp: 2025-09-26T03:04:00.408510889Z Next consensus number: 2169 Legacy running event hash: b87bdc95141bfb91889048176427c9829306296846994fb9caafaa453bb377b2848567e9dd799b34818c73f4b04bf760 Legacy running event mnemonic: vague-hope-lock-mistake Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -873558317 Root hash: 6057dbafa0398007c5176fcdd7ccee2909e2e1d2f73fd8cb6c532a53b9e182d25c5660823fcb9854f8f60f2806803ec3 (root) ConsistencyTestingToolState / april-mechanic-wedding-you 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 decorate-right-earth-record 1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire 2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub 3 StringLeaf 8108526713557696734 /3 fancy-sound-already-royal 4 StringLeaf 62 /4 ignore-equal-practice-again
node2 47.128s 2025-09-26 03:04:02.122 824 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+03+31.391433828Z_seq0_minr1_maxr501_orgn0.pces
node2 47.128s 2025-09-26 03:04:02.122 825 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 35 File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+03+31.391433828Z_seq0_minr1_maxr501_orgn0.pces
node2 47.128s 2025-09-26 03:04:02.122 826 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 47.130s 2025-09-26 03:04:02.124 827 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 47.131s 2025-09-26 03:04:02.125 828 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 62 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/62 {"round":62,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/62/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 47.136s 2025-09-26 03:04:02.130 796 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 62
node4 47.139s 2025-09-26 03:04:02.133 797 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 62 Timestamp: 2025-09-26T03:04:00.408510889Z Next consensus number: 2169 Legacy running event hash: b87bdc95141bfb91889048176427c9829306296846994fb9caafaa453bb377b2848567e9dd799b34818c73f4b04bf760 Legacy running event mnemonic: vague-hope-lock-mistake Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -873558317 Root hash: 6057dbafa0398007c5176fcdd7ccee2909e2e1d2f73fd8cb6c532a53b9e182d25c5660823fcb9854f8f60f2806803ec3 (root) ConsistencyTestingToolState / april-mechanic-wedding-you 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 decorate-right-earth-record 1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire 2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub 3 StringLeaf 8108526713557696734 /3 fancy-sound-already-royal 4 StringLeaf 62 /4 ignore-equal-practice-again
node3 47.142s 2025-09-26 03:04:02.136 814 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 62
node3 47.144s 2025-09-26 03:04:02.138 815 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 62 Timestamp: 2025-09-26T03:04:00.408510889Z Next consensus number: 2169 Legacy running event hash: b87bdc95141bfb91889048176427c9829306296846994fb9caafaa453bb377b2848567e9dd799b34818c73f4b04bf760 Legacy running event mnemonic: vague-hope-lock-mistake Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -873558317 Root hash: 6057dbafa0398007c5176fcdd7ccee2909e2e1d2f73fd8cb6c532a53b9e182d25c5660823fcb9854f8f60f2806803ec3 (root) ConsistencyTestingToolState / april-mechanic-wedding-you 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 decorate-right-earth-record 1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire 2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub 3 StringLeaf 8108526713557696734 /3 fancy-sound-already-royal 4 StringLeaf 62 /4 ignore-equal-practice-again
node4 47.148s 2025-09-26 03:04:02.142 798 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T03+03+31.142041701Z_seq0_minr1_maxr501_orgn0.pces
node4 47.149s 2025-09-26 03:04:02.143 799 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 35 File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T03+03+31.142041701Z_seq0_minr1_maxr501_orgn0.pces
node4 47.149s 2025-09-26 03:04:02.143 800 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 47.151s 2025-09-26 03:04:02.145 801 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 47.152s 2025-09-26 03:04:02.146 816 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+03+31.256200715Z_seq0_minr1_maxr501_orgn0.pces
node4 47.152s 2025-09-26 03:04:02.146 802 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 62 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/62 {"round":62,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/62/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 47.153s 2025-09-26 03:04:02.147 817 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 35 File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+03+31.256200715Z_seq0_minr1_maxr501_orgn0.pces
node3 47.153s 2025-09-26 03:04:02.147 818 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 47.155s 2025-09-26 03:04:02.149 819 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 47.155s 2025-09-26 03:04:02.149 820 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 62 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/62 {"round":62,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/62/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 47.170s 2025-09-26 03:04:02.164 818 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 62
node1 47.173s 2025-09-26 03:04:02.167 819 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 62 Timestamp: 2025-09-26T03:04:00.408510889Z Next consensus number: 2169 Legacy running event hash: b87bdc95141bfb91889048176427c9829306296846994fb9caafaa453bb377b2848567e9dd799b34818c73f4b04bf760 Legacy running event mnemonic: vague-hope-lock-mistake Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -873558317 Root hash: 6057dbafa0398007c5176fcdd7ccee2909e2e1d2f73fd8cb6c532a53b9e182d25c5660823fcb9854f8f60f2806803ec3 (root) ConsistencyTestingToolState / april-mechanic-wedding-you 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 decorate-right-earth-record 1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire 2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub 3 StringLeaf 8108526713557696734 /3 fancy-sound-already-royal 4 StringLeaf 62 /4 ignore-equal-practice-again
node1 47.181s 2025-09-26 03:04:02.175 820 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+03+31.280166514Z_seq0_minr1_maxr501_orgn0.pces
node1 47.181s 2025-09-26 03:04:02.175 821 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 35 File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+03+31.280166514Z_seq0_minr1_maxr501_orgn0.pces
node1 47.181s 2025-09-26 03:04:02.175 822 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 47.183s 2025-09-26 03:04:02.177 823 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 47.184s 2025-09-26 03:04:02.178 824 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 62 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/62 {"round":62,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/62/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 47.198s 2025-09-26 03:04:02.192 802 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 62
node0 47.200s 2025-09-26 03:04:02.194 803 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 62 Timestamp: 2025-09-26T03:04:00.408510889Z Next consensus number: 2169 Legacy running event hash: b87bdc95141bfb91889048176427c9829306296846994fb9caafaa453bb377b2848567e9dd799b34818c73f4b04bf760 Legacy running event mnemonic: vague-hope-lock-mistake Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -873558317 Root hash: 6057dbafa0398007c5176fcdd7ccee2909e2e1d2f73fd8cb6c532a53b9e182d25c5660823fcb9854f8f60f2806803ec3 (root) ConsistencyTestingToolState / april-mechanic-wedding-you 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 decorate-right-earth-record 1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire 2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub 3 StringLeaf 8108526713557696734 /3 fancy-sound-already-royal 4 StringLeaf 62 /4 ignore-equal-practice-again
node0 47.210s 2025-09-26 03:04:02.204 804 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+03+31.273927897Z_seq0_minr1_maxr501_orgn0.pces
node0 47.211s 2025-09-26 03:04:02.205 805 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 35 File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+03+31.273927897Z_seq0_minr1_maxr501_orgn0.pces
node0 47.211s 2025-09-26 03:04:02.205 806 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 47.213s 2025-09-26 03:04:02.207 807 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 47.213s 2025-09-26 03:04:02.207 808 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 62 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/62 {"round":62,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/62/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
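Each completed snapshot episode above ends with a SignedStateFileWriter line that carries a StateSavedToDiskPayload JSON fragment ({"round":..,"freezeState":..,"reason":..,"directory":..}). A minimal sketch for pulling those payloads out of a log dump like this one; the regex, function name, and the abbreviated sample path are illustrative assumptions, not part of the platform:

```python
# Sketch: extract StateSavedToDiskPayload JSON fragments from a log dump like the one above.
# The line shape is assumed from the logs shown here; names are illustrative.
import json
import re

PAYLOAD_RE = re.compile(
    r"^(?P<node>node\d+)\s.*Finished writing state for round (?P<round>\d+) to disk\."
    r".*?(?P<json>\{.*\})\s*\[com\.swirlds\.logging\.legacy\.payload\.StateSavedToDiskPayload\]"
)

def saved_state_payloads(lines):
    """Yield (node, payload dict) for every 'Finished writing state' line."""
    for line in lines:
        match = PAYLOAD_RE.search(line)
        if match:
            yield match.group("node"), json.loads(match.group("json"))

# Example against one of the lines above (directory path shortened here):
sample = ('node0 47.213s 2025-09-26 03:04:02.207 808 INFO STATE_TO_DISK '
          '<<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state '
          'for round 62 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/.../62 '
          '{"round":62,"freezeState":false,"reason":"PERIODIC_SNAPSHOT",'
          '"directory":"file:///opt/.../62/"} '
          '[com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]')
for node, payload in saved_state_payloads([sample]):
    print(node, payload["round"], payload["reason"])  # node0 62 PERIODIC_SNAPSHOT
```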
node2 1m 45.985s 2025-09-26 03:05:00.979 2312 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 197 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 1m 46.001s 2025-09-26 03:05:00.995 2274 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 197 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 1m 46.077s 2025-09-26 03:05:01.071 2288 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 197 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 1m 46.110s 2025-09-26 03:05:01.104 2296 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 197 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 1m 46.198s 2025-09-26 03:05:01.192 2306 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 197 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 1m 46.248s 2025-09-26 03:05:01.242 2309 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 197 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/197
node1 1m 46.249s 2025-09-26 03:05:01.243 2310 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 197
node3 1m 46.274s 2025-09-26 03:05:01.268 2291 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 197 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/197
node3 1m 46.275s 2025-09-26 03:05:01.269 2292 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 197
node1 1m 46.342s 2025-09-26 03:05:01.336 2341 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 197
node1 1m 46.345s 2025-09-26 03:05:01.339 2342 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 197
Timestamp: 2025-09-26T03:05:00.097946Z
Next consensus number: 6952
Legacy running event hash: 2a36bedc3089c6dc99b9f508791ace38a57ee3c16ee7d76c026728279a1ddfe2a81f8ef5249730cfc565d2ad48935c89
Legacy running event mnemonic: egg-cause-bachelor-cause
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 645309195
Root hash: 544fe05d10debd5f91516528929d3da1c550056b7bfd5afa52625328afb6992769e42089d8bb9f533f9fcc61ef6fda21
(root) ConsistencyTestingToolState / measure-virtual-bridge-feature
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 speed-oblige-anger-fence
    1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
    2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
    3 StringLeaf 2082384807159715715 /3 sugar-decorate-snow-capable
    4 StringLeaf 197 /4 rich-piano-actress-victory
node1 1m 46.354s 2025-09-26 03:05:01.348 2343 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+03+31.280166514Z_seq0_minr1_maxr501_orgn0.pces
node1 1m 46.354s 2025-09-26 03:05:01.348 2344 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 170 File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+03+31.280166514Z_seq0_minr1_maxr501_orgn0.pces
node1 1m 46.354s 2025-09-26 03:05:01.348 2345 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 1m 46.355s 2025-09-26 03:05:01.349 2323 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 197
node3 1m 46.357s 2025-09-26 03:05:01.351 2324 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 197
Timestamp: 2025-09-26T03:05:00.097946Z
Next consensus number: 6952
Legacy running event hash: 2a36bedc3089c6dc99b9f508791ace38a57ee3c16ee7d76c026728279a1ddfe2a81f8ef5249730cfc565d2ad48935c89
Legacy running event mnemonic: egg-cause-bachelor-cause
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 645309195
Root hash: 544fe05d10debd5f91516528929d3da1c550056b7bfd5afa52625328afb6992769e42089d8bb9f533f9fcc61ef6fda21
(root) ConsistencyTestingToolState / measure-virtual-bridge-feature
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 speed-oblige-anger-fence
    1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
    2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
    3 StringLeaf 2082384807159715715 /3 sugar-decorate-snow-capable
    4 StringLeaf 197 /4 rich-piano-actress-victory
node1 1m 46.359s 2025-09-26 03:05:01.353 2346 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 1m 46.360s 2025-09-26 03:05:01.354 2347 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 197 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/197 {"round":197,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/197/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 1m 46.367s 2025-09-26 03:05:01.361 2325 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+03+31.256200715Z_seq0_minr1_maxr501_orgn0.pces
node3 1m 46.367s 2025-09-26 03:05:01.361 2326 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 170 File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+03+31.256200715Z_seq0_minr1_maxr501_orgn0.pces
node3 1m 46.368s 2025-09-26 03:05:01.362 2327 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 1m 46.373s 2025-09-26 03:05:01.367 2328 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 1m 46.373s 2025-09-26 03:05:01.367 2329 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 197 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/197 {"round":197,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/197/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 1m 46.403s 2025-09-26 03:05:01.397 2299 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 197 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/197
node4 1m 46.404s 2025-09-26 03:05:01.398 2300 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 197
node2 1m 46.407s 2025-09-26 03:05:01.401 2331 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 197 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/197
node2 1m 46.408s 2025-09-26 03:05:01.402 2332 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 197
node0 1m 46.424s 2025-09-26 03:05:01.418 2293 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 197 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/197
node0 1m 46.425s 2025-09-26 03:05:01.419 2294 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 197
node4 1m 46.480s 2025-09-26 03:05:01.474 2335 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 197
node4 1m 46.482s 2025-09-26 03:05:01.476 2336 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 197
Timestamp: 2025-09-26T03:05:00.097946Z
Next consensus number: 6952
Legacy running event hash: 2a36bedc3089c6dc99b9f508791ace38a57ee3c16ee7d76c026728279a1ddfe2a81f8ef5249730cfc565d2ad48935c89
Legacy running event mnemonic: egg-cause-bachelor-cause
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 645309195
Root hash: 544fe05d10debd5f91516528929d3da1c550056b7bfd5afa52625328afb6992769e42089d8bb9f533f9fcc61ef6fda21
(root) ConsistencyTestingToolState / measure-virtual-bridge-feature
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 speed-oblige-anger-fence
    1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
    2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
    3 StringLeaf 2082384807159715715 /3 sugar-decorate-snow-capable
    4 StringLeaf 197 /4 rich-piano-actress-victory
node4 1m 46.491s 2025-09-26 03:05:01.485 2337 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T03+03+31.142041701Z_seq0_minr1_maxr501_orgn0.pces
node4 1m 46.491s 2025-09-26 03:05:01.485 2338 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 170 File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T03+03+31.142041701Z_seq0_minr1_maxr501_orgn0.pces
node4 1m 46.491s 2025-09-26 03:05:01.485 2339 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 1m 46.494s 2025-09-26 03:05:01.488 2375 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 197
node2 1m 46.496s 2025-09-26 03:05:01.490 2376 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 197
Timestamp: 2025-09-26T03:05:00.097946Z
Next consensus number: 6952
Legacy running event hash: 2a36bedc3089c6dc99b9f508791ace38a57ee3c16ee7d76c026728279a1ddfe2a81f8ef5249730cfc565d2ad48935c89
Legacy running event mnemonic: egg-cause-bachelor-cause
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 645309195
Root hash: 544fe05d10debd5f91516528929d3da1c550056b7bfd5afa52625328afb6992769e42089d8bb9f533f9fcc61ef6fda21
(root) ConsistencyTestingToolState / measure-virtual-bridge-feature
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 speed-oblige-anger-fence
    1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
    2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
    3 StringLeaf 2082384807159715715 /3 sugar-decorate-snow-capable
    4 StringLeaf 197 /4 rich-piano-actress-victory
node4 1m 46.496s 2025-09-26 03:05:01.490 2340 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 1m 46.497s 2025-09-26 03:05:01.491 2341 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 197 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/197 {"round":197,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/197/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 1m 46.503s 2025-09-26 03:05:01.497 2377 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+03+31.391433828Z_seq0_minr1_maxr501_orgn0.pces
node2 1m 46.503s 2025-09-26 03:05:01.497 2378 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 170 File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+03+31.391433828Z_seq0_minr1_maxr501_orgn0.pces
node2 1m 46.503s 2025-09-26 03:05:01.497 2379 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 1m 46.508s 2025-09-26 03:05:01.502 2380 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 1m 46.508s 2025-09-26 03:05:01.502 2381 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 197 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/197 {"round":197,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/197/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 1m 46.520s 2025-09-26 03:05:01.514 2337 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 197
node0 1m 46.522s 2025-09-26 03:05:01.516 2338 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 197
Timestamp: 2025-09-26T03:05:00.097946Z
Next consensus number: 6952
Legacy running event hash: 2a36bedc3089c6dc99b9f508791ace38a57ee3c16ee7d76c026728279a1ddfe2a81f8ef5249730cfc565d2ad48935c89
Legacy running event mnemonic: egg-cause-bachelor-cause
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 645309195
Root hash: 544fe05d10debd5f91516528929d3da1c550056b7bfd5afa52625328afb6992769e42089d8bb9f533f9fcc61ef6fda21
(root) ConsistencyTestingToolState / measure-virtual-bridge-feature
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 speed-oblige-anger-fence
    1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
    2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
    3 StringLeaf 2082384807159715715 /3 sugar-decorate-snow-capable
    4 StringLeaf 197 /4 rich-piano-actress-victory
node0 1m 46.530s 2025-09-26 03:05:01.524 2339 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+03+31.273927897Z_seq0_minr1_maxr501_orgn0.pces
node0 1m 46.530s 2025-09-26 03:05:01.524 2340 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 170 File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+03+31.273927897Z_seq0_minr1_maxr501_orgn0.pces
node0 1m 46.530s 2025-09-26 03:05:01.524 2341 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 1m 46.535s 2025-09-26 03:05:01.529 2342 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 1m 46.536s 2025-09-26 03:05:01.530 2343 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 197 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/197 {"round":197,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/197/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
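For any given round, every node above reports the same root hash and mnemonic (for round 197: 544fe05d… / measure-virtual-bridge-feature on nodes 0 through 4), which is exactly the cross-node invariant one would want to assert when reviewing this log. A hedged sketch that groups the "Root hash" field of the state-info dumps by round and flags any divergence; the parsing and names are illustrative, not platform APIs:

```python
# Sketch: check that all nodes report the same root hash for each written round.
# Assumes the "Round:" / "Root hash:" fields of the state-info dumps shown above.
import re
from collections import defaultdict

STATE_INFO_RE = re.compile(r"Round:\s*(\d+).*?Root hash:\s*([0-9a-f]+)", re.DOTALL)

def root_hashes_by_round(node_to_text):
    """node_to_text maps a node id (e.g. 'node0') to that node's raw log text."""
    by_round = defaultdict(dict)
    for node, text in node_to_text.items():
        for round_num, root_hash in STATE_INFO_RE.findall(text):
            by_round[int(round_num)][node] = root_hash
    return by_round

def check_consistency(by_round):
    """Print one line per round; a mismatch would indicate diverging states."""
    for round_num, hashes in sorted(by_round.items()):
        if len(set(hashes.values())) != 1:
            print(f"MISMATCH in round {round_num}: {hashes}")
        else:
            print(f"round {round_num}: {len(hashes)} node(s) agree")

# Usage: check_consistency(root_hashes_by_round({"node0": node0_log, "node1": node1_log, ...}))
```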
node2 2m 46.135s 2025-09-26 03:06:01.129 3879 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 334 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 2m 46.141s 2025-09-26 03:06:01.135 3829 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 334 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 2m 46.148s 2025-09-26 03:06:01.142 3847 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 334 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 2m 46.190s 2025-09-26 03:06:01.184 3823 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 334 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 2m 46.192s 2025-09-26 03:06:01.186 3825 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 334 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 2m 46.349s 2025-09-26 03:06:01.343 3882 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 334 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/334
node2 2m 46.350s 2025-09-26 03:06:01.344 3883 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 334
node4 2m 46.375s 2025-09-26 03:06:01.369 3828 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 334 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/334
node4 2m 46.376s 2025-09-26 03:06:01.370 3829 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 334
node3 2m 46.385s 2025-09-26 03:06:01.379 3826 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 334 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/334
node3 2m 46.386s 2025-09-26 03:06:01.380 3827 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 334
node0 2m 46.391s 2025-09-26 03:06:01.385 3832 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 334 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/334
node0 2m 46.392s 2025-09-26 03:06:01.386 3833 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 334
node2 2m 46.442s 2025-09-26 03:06:01.436 3914 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 334
node2 2m 46.444s 2025-09-26 03:06:01.438 3915 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 334
Timestamp: 2025-09-26T03:06:00.152761Z
Next consensus number: 11771
Legacy running event hash: 5b7614730a597f7a27f273a91bfe25027d6d839ba6c4bafa68c60a50aaf6e4c81bcbb6e6011b007d6eb4c526c58e084f
Legacy running event mnemonic: human-mobile-enroll-urban
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1962935126
Root hash: 58ba5a38bf480a34644d16e867cc572408ee2fdd28663d3f6b66234121c442fcccbc2a9d639aae80e751c342933437cc
(root) ConsistencyTestingToolState / blanket-either-blade-educate
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 hospital-quantum-congress-enroll
    1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
    2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
    3 StringLeaf 575583921648146648 /3 muscle-narrow-doctor-number
    4 StringLeaf 334 /4 feel-smile-tourist-patch
node2 2m 46.451s 2025-09-26 03:06:01.445 3916 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+03+31.391433828Z_seq0_minr1_maxr501_orgn0.pces
node2 2m 46.452s 2025-09-26 03:06:01.446 3917 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 306 File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+03+31.391433828Z_seq0_minr1_maxr501_orgn0.pces
node2 2m 46.452s 2025-09-26 03:06:01.446 3918 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 2m 46.454s 2025-09-26 03:06:01.448 3864 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 334
node4 2m 46.456s 2025-09-26 03:06:01.450 3865 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 334
Timestamp: 2025-09-26T03:06:00.152761Z
Next consensus number: 11771
Legacy running event hash: 5b7614730a597f7a27f273a91bfe25027d6d839ba6c4bafa68c60a50aaf6e4c81bcbb6e6011b007d6eb4c526c58e084f
Legacy running event mnemonic: human-mobile-enroll-urban
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1962935126
Root hash: 58ba5a38bf480a34644d16e867cc572408ee2fdd28663d3f6b66234121c442fcccbc2a9d639aae80e751c342933437cc
(root) ConsistencyTestingToolState / blanket-either-blade-educate
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 hospital-quantum-congress-enroll
    1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
    2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
    3 StringLeaf 575583921648146648 /3 muscle-narrow-doctor-number
    4 StringLeaf 334 /4 feel-smile-tourist-patch
node2 2m 46.460s 2025-09-26 03:06:01.454 3927 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 2m 46.461s 2025-09-26 03:06:01.455 3850 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 334 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/334
node2 2m 46.461s 2025-09-26 03:06:01.455 3928 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 334 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/334 {"round":334,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/334/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 2m 46.462s 2025-09-26 03:06:01.456 3851 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 334
node4 2m 46.464s 2025-09-26 03:06:01.458 3866 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T03+03+31.142041701Z_seq0_minr1_maxr501_orgn0.pces
node4 2m 46.465s 2025-09-26 03:06:01.459 3867 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 306 File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T03+03+31.142041701Z_seq0_minr1_maxr501_orgn0.pces
node4 2m 46.465s 2025-09-26 03:06:01.459 3868 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 2m 46.477s 2025-09-26 03:06:01.471 3869 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 2m 46.477s 2025-09-26 03:06:01.471 3870 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 334 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/334 {"round":334,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/334/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 2m 46.478s 2025-09-26 03:06:01.472 3864 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 334
node3 2m 46.479s 2025-09-26 03:06:01.473 3862 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 334
node0 2m 46.480s 2025-09-26 03:06:01.474 3865 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 334
Timestamp: 2025-09-26T03:06:00.152761Z
Next consensus number: 11771
Legacy running event hash: 5b7614730a597f7a27f273a91bfe25027d6d839ba6c4bafa68c60a50aaf6e4c81bcbb6e6011b007d6eb4c526c58e084f
Legacy running event mnemonic: human-mobile-enroll-urban
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1962935126
Root hash: 58ba5a38bf480a34644d16e867cc572408ee2fdd28663d3f6b66234121c442fcccbc2a9d639aae80e751c342933437cc
(root) ConsistencyTestingToolState / blanket-either-blade-educate
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 hospital-quantum-congress-enroll
    1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
    2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
    3 StringLeaf 575583921648146648 /3 muscle-narrow-doctor-number
    4 StringLeaf 334 /4 feel-smile-tourist-patch
node3 2m 46.481s 2025-09-26 03:06:01.475 3863 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 334
Timestamp: 2025-09-26T03:06:00.152761Z
Next consensus number: 11771
Legacy running event hash: 5b7614730a597f7a27f273a91bfe25027d6d839ba6c4bafa68c60a50aaf6e4c81bcbb6e6011b007d6eb4c526c58e084f
Legacy running event mnemonic: human-mobile-enroll-urban
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1962935126
Root hash: 58ba5a38bf480a34644d16e867cc572408ee2fdd28663d3f6b66234121c442fcccbc2a9d639aae80e751c342933437cc
(root) ConsistencyTestingToolState / blanket-either-blade-educate
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 hospital-quantum-congress-enroll
    1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
    2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
    3 StringLeaf 575583921648146648 /3 muscle-narrow-doctor-number
    4 StringLeaf 334 /4 feel-smile-tourist-patch
node0 2m 46.487s 2025-09-26 03:06:01.481 3874 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+03+31.273927897Z_seq0_minr1_maxr501_orgn0.pces
node0 2m 46.487s 2025-09-26 03:06:01.481 3875 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 306 File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+03+31.273927897Z_seq0_minr1_maxr501_orgn0.pces
node0 2m 46.487s 2025-09-26 03:06:01.481 3876 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 2m 46.489s 2025-09-26 03:06:01.483 3864 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+03+31.256200715Z_seq0_minr1_maxr501_orgn0.pces
node3 2m 46.490s 2025-09-26 03:06:01.484 3865 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 306 File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+03+31.256200715Z_seq0_minr1_maxr501_orgn0.pces
node3 2m 46.490s 2025-09-26 03:06:01.484 3866 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 2m 46.495s 2025-09-26 03:06:01.489 3877 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 2m 46.496s 2025-09-26 03:06:01.490 3878 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 334 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/334 {"round":334,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/334/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 2m 46.498s 2025-09-26 03:06:01.492 3867 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 2m 46.499s 2025-09-26 03:06:01.493 3868 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 334 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/334 {"round":334,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/334/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 2m 46.557s 2025-09-26 03:06:01.551 3882 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 334
node1 2m 46.559s 2025-09-26 03:06:01.553 3883 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 334
Timestamp: 2025-09-26T03:06:00.152761Z
Next consensus number: 11771
Legacy running event hash: 5b7614730a597f7a27f273a91bfe25027d6d839ba6c4bafa68c60a50aaf6e4c81bcbb6e6011b007d6eb4c526c58e084f
Legacy running event mnemonic: human-mobile-enroll-urban
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1962935126
Root hash: 58ba5a38bf480a34644d16e867cc572408ee2fdd28663d3f6b66234121c442fcccbc2a9d639aae80e751c342933437cc
(root) ConsistencyTestingToolState / blanket-either-blade-educate
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 hospital-quantum-congress-enroll
    1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
    2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
    3 StringLeaf 575583921648146648 /3 muscle-narrow-doctor-number
    4 StringLeaf 334 /4 feel-smile-tourist-patch
node1 2m 46.566s 2025-09-26 03:06:01.560 3884 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+03+31.280166514Z_seq0_minr1_maxr501_orgn0.pces
node1 2m 46.567s 2025-09-26 03:06:01.561 3885 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 306 File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+03+31.280166514Z_seq0_minr1_maxr501_orgn0.pces
node1 2m 46.567s 2025-09-26 03:06:01.561 3886 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 2m 46.577s 2025-09-26 03:06:01.571 3887 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 2m 46.577s 2025-09-26 03:06:01.571 3888 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 334 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/334 {"round":334,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/334/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
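The preconsensus event files copied alongside each snapshot share a self-describing name, e.g. 2025-09-26T03+03+31.280166514Z_seq0_minr1_maxr501_orgn0.pces, which appears to encode a creation timestamp (with ':' replaced by '+' for the filesystem), a sequence number, minimum and maximum round bounds, and an origin round. A small sketch under that reading; the field interpretation is inferred from the names in these logs, not taken from a platform specification:

```python
# Sketch: decode the fields visible in a .pces filename from the logs above.
# The meaning of seq/minr/maxr/orgn is an inference from the names, not a spec.
import re

PCES_NAME_RE = re.compile(
    r"(?P<timestamp>.+Z)_seq(?P<seq>\d+)_minr(?P<minr>\d+)_maxr(?P<maxr>\d+)_orgn(?P<orgn>\d+)\.pces$"
)

def parse_pces_name(filename):
    match = PCES_NAME_RE.search(filename)
    if match is None:
        return None
    fields = match.groupdict()
    return {
        "timestamp": fields["timestamp"].replace("+", ":"),  # '+' appears to stand in for ':'
        "sequence": int(fields["seq"]),
        "min_round": int(fields["minr"]),
        "max_round": int(fields["maxr"]),
        "origin": int(fields["orgn"]),
    }

print(parse_pces_name("2025-09-26T03+03+31.280166514Z_seq0_minr1_maxr501_orgn0.pces"))
# {'timestamp': '2025-09-26T03:03:31.280166514Z', 'sequence': 0,
#  'min_round': 1, 'max_round': 501, 'origin': 0}
```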
node3 3m 46.030s 2025-09-26 03:07:01.024 5390 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 471 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 3m 46.034s 2025-09-26 03:07:01.028 5430 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 471 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 3m 46.047s 2025-09-26 03:07:01.041 5432 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 471 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 3m 46.062s 2025-09-26 03:07:01.056 5406 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 471 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 3m 46.235s 2025-09-26 03:07:01.229 5435 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 471 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/471
node1 3m 46.236s 2025-09-26 03:07:01.230 5436 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 471
node2 3m 46.249s 2025-09-26 03:07:01.243 5433 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 471 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/471
node2 3m 46.250s 2025-09-26 03:07:01.244 5434 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 471
node3 3m 46.306s 2025-09-26 03:07:01.300 5393 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 471 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/471
node3 3m 46.307s 2025-09-26 03:07:01.301 5394 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 471
node0 3m 46.308s 2025-09-26 03:07:01.302 5409 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 471 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/471
node0 3m 46.309s 2025-09-26 03:07:01.303 5410 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 471
node1 3m 46.325s 2025-09-26 03:07:01.319 5467 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 471
node1 3m 46.327s 2025-09-26 03:07:01.321 5468 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 471
Timestamp: 2025-09-26T03:07:00.165883Z
Next consensus number: 15749
Legacy running event hash: 2bc5df2a5db05de2122f2ea40b0e1ef865ab710e11e53d93df84896f63be74a352a71b6b2fc9b5b5fde51932ba592871
Legacy running event mnemonic: judge-bracket-render-rough
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1771446969
Root hash: 50d422be15ad7a0bc59c5cbe1dc27beb5260b8a9287ed481b29845a9ee5a050a619a55b16b124e820d300dd0a1529682
(root) ConsistencyTestingToolState / box-fit-remember-door
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 economy-squirrel-human-scrub
    1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
    2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
    3 StringLeaf -953556001017278104 /3 search-sweet-intact-celery
    4 StringLeaf 471 /4 run-absorb-oven-lawsuit
node1 3m 46.335s 2025-09-26 03:07:01.329 5469 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+03+31.280166514Z_seq0_minr1_maxr501_orgn0.pces
node1 3m 46.335s 2025-09-26 03:07:01.329 5470 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 444 File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+03+31.280166514Z_seq0_minr1_maxr501_orgn0.pces
node1 3m 46.335s 2025-09-26 03:07:01.329 5471 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 3m 46.338s 2025-09-26 03:07:01.332 5469 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 471
node2 3m 46.340s 2025-09-26 03:07:01.334 5470 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 471
Timestamp: 2025-09-26T03:07:00.165883Z
Next consensus number: 15749
Legacy running event hash: 2bc5df2a5db05de2122f2ea40b0e1ef865ab710e11e53d93df84896f63be74a352a71b6b2fc9b5b5fde51932ba592871
Legacy running event mnemonic: judge-bracket-render-rough
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1771446969
Root hash: 50d422be15ad7a0bc59c5cbe1dc27beb5260b8a9287ed481b29845a9ee5a050a619a55b16b124e820d300dd0a1529682
(root) ConsistencyTestingToolState / box-fit-remember-door
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 economy-squirrel-human-scrub
    1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
    2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
    3 StringLeaf -953556001017278104 /3 search-sweet-intact-celery
    4 StringLeaf 471 /4 run-absorb-oven-lawsuit
node1 3m 46.346s 2025-09-26 03:07:01.340 5472 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 3m 46.346s 2025-09-26 03:07:01.340 5473 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 471 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/471 {"round":471,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/471/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 3m 46.348s 2025-09-26 03:07:01.342 5471 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+03+31.391433828Z_seq0_minr1_maxr501_orgn0.pces
node2 3m 46.348s 2025-09-26 03:07:01.342 5472 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 444 File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+03+31.391433828Z_seq0_minr1_maxr501_orgn0.pces
node2 3m 46.348s 2025-09-26 03:07:01.342 5473 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 3m 46.359s 2025-09-26 03:07:01.353 5474 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 3m 46.360s 2025-09-26 03:07:01.354 5475 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 471 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/471 {"round":471,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/471/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 3m 46.396s 2025-09-26 03:07:01.390 5425 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 471
node3 3m 46.398s 2025-09-26 03:07:01.392 5426 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 471
Timestamp: 2025-09-26T03:07:00.165883Z
Next consensus number: 15749
Legacy running event hash: 2bc5df2a5db05de2122f2ea40b0e1ef865ab710e11e53d93df84896f63be74a352a71b6b2fc9b5b5fde51932ba592871
Legacy running event mnemonic: judge-bracket-render-rough
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1771446969
Root hash: 50d422be15ad7a0bc59c5cbe1dc27beb5260b8a9287ed481b29845a9ee5a050a619a55b16b124e820d300dd0a1529682
(root) ConsistencyTestingToolState / box-fit-remember-door
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 economy-squirrel-human-scrub
    1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
    2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
    3 StringLeaf -953556001017278104 /3 search-sweet-intact-celery
    4 StringLeaf 471 /4 run-absorb-oven-lawsuit
node0 3m 46.401s 2025-09-26 03:07:01.395 5441 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 471
node0 3m 46.403s 2025-09-26 03:07:01.397 5442 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 471
Timestamp: 2025-09-26T03:07:00.165883Z
Next consensus number: 15749
Legacy running event hash: 2bc5df2a5db05de2122f2ea40b0e1ef865ab710e11e53d93df84896f63be74a352a71b6b2fc9b5b5fde51932ba592871
Legacy running event mnemonic: judge-bracket-render-rough
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1771446969
Root hash: 50d422be15ad7a0bc59c5cbe1dc27beb5260b8a9287ed481b29845a9ee5a050a619a55b16b124e820d300dd0a1529682
(root) ConsistencyTestingToolState / box-fit-remember-door
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 economy-squirrel-human-scrub
    1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
    2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
    3 StringLeaf -953556001017278104 /3 search-sweet-intact-celery
    4 StringLeaf 471 /4 run-absorb-oven-lawsuit
node3 3m 46.408s 2025-09-26 03:07:01.402 5435 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+03+31.256200715Z_seq0_minr1_maxr501_orgn0.pces
node3 3m 46.408s 2025-09-26 03:07:01.402 5436 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 444 File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+03+31.256200715Z_seq0_minr1_maxr501_orgn0.pces
node3 3m 46.408s 2025-09-26 03:07:01.402 5437 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 3m 46.411s 2025-09-26 03:07:01.405 5443 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+03+31.273927897Z_seq0_minr1_maxr501_orgn0.pces
node0 3m 46.412s 2025-09-26 03:07:01.406 5444 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 444 File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+03+31.273927897Z_seq0_minr1_maxr501_orgn0.pces
node0 3m 46.412s 2025-09-26 03:07:01.406 5445 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 3m 46.419s 2025-09-26 03:07:01.413 5438 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 3m 46.419s 2025-09-26 03:07:01.413 5439 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 471 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/471 {"round":471,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/471/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 3m 46.423s 2025-09-26 03:07:01.417 5446 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 3m 46.424s 2025-09-26 03:07:01.418 5447 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 471 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/471 {"round":471,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/471/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
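All of the PERIODIC_SNAPSHOT directories above follow the same layout, data/saved/<main class name>/<node id>/<swirld id>/<round> (the swirld id here is 123), and the written rounds land roughly one minute apart (62, 197, 334, 471, ...). A sketch that enumerates which rounds a node has saved under that layout; the helper name and arguments are illustrative assumptions:

```python
# Sketch: list saved-state rounds under the directory layout seen in these logs:
#   data/saved/<main class name>/<node id>/<swirld id>/<round>
# Names and defaults here are illustrative, not platform configuration.
from pathlib import Path

def saved_rounds(saved_root, main_class, node_id, swirld_id="123"):
    """Return the sorted round numbers that currently have a saved state on disk."""
    base = Path(saved_root) / main_class / str(node_id) / str(swirld_id)
    if not base.is_dir():
        return []
    return sorted(int(p.name) for p in base.iterdir() if p.is_dir() and p.name.isdigit())

# e.g. saved_rounds("data/saved", "com.swirlds.demo.consistency.ConsistencyTestingToolMain", 0)
# might return [62, 197, 334, 471] for the run captured above.
```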
node3 4m 46.290s 2025-09-26 03:08:01.284 7013 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 610 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 4m 46.334s 2025-09-26 03:08:01.328 7005 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 610 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 4m 46.349s 2025-09-26 03:08:01.343 7001 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 610 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 4m 46.351s 2025-09-26 03:08:01.345 6977 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 610 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 4m 46.496s 2025-09-26 03:08:01.490 6980 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 610 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/610
node0 4m 46.497s 2025-09-26 03:08:01.491 6981 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 610
node2 4m 46.555s 2025-09-26 03:08:01.549 7008 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 610 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/610
node2 4m 46.556s 2025-09-26 03:08:01.550 7009 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 610
node1 4m 46.562s 2025-09-26 03:08:01.556 7004 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 610 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/610
node1 4m 46.563s 2025-09-26 03:08:01.557 7005 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 610
node3 4m 46.567s 2025-09-26 03:08:01.561 7016 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 610 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/610
node3 4m 46.568s 2025-09-26 03:08:01.562 7017 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 610
node0 4m 46.587s 2025-09-26 03:08:01.581 7012 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 610
node0 4m 46.590s 2025-09-26 03:08:01.584 7013 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 610
Timestamp: 2025-09-26T03:08:00.368181Z
Next consensus number: 19018
Legacy running event hash: 2ed33d65bcdb7b1931d60a1c8267ac0b42d099618c7c1670d7b6833f547d7962d3410f6f22e3319e347ccc2425ff155f
Legacy running event mnemonic: surprise-protect-gorilla-glimpse
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 764282614
Root hash: 41f299b50ac40c8b3fb9f3257db7d892012924396df2b1e10a47e99ac15c068a3acf6f86c5026995253ab48e007893f0
(root) ConsistencyTestingToolState / dinner-siren-flat-multiply
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 soft-what-evil-canvas
    1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
    2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
    3 StringLeaf -730998593439248566 /3 leaf-put-ring-fun
    4 StringLeaf 610 /4 present-review-dinner-rug
node0 4m 46.597s 2025-09-26 03:08:01.591 7014 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+07+14.082954667Z_seq1_minr474_maxr5474_orgn0.pces Last file: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+03+31.273927897Z_seq0_minr1_maxr501_orgn0.pces
node0 4m 46.597s 2025-09-26 03:08:01.591 7015 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 583 File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+07+14.082954667Z_seq1_minr474_maxr5474_orgn0.pces
node0 4m 46.597s 2025-09-26 03:08:01.591 7016 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 4m 46.599s 2025-09-26 03:08:01.593 7017 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 4m 46.599s 2025-09-26 03:08:01.593 7018 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 610 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/610 {"round":610,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/610/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 4m 46.601s 2025-09-26 03:08:01.595 7019 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1
node2 4m 46.646s 2025-09-26 03:08:01.640 7040 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 610
node2 4m 46.648s 2025-09-26 03:08:01.642 7041 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 610
Timestamp: 2025-09-26T03:08:00.368181Z
Next consensus number: 19018
Legacy running event hash: 2ed33d65bcdb7b1931d60a1c8267ac0b42d099618c7c1670d7b6833f547d7962d3410f6f22e3319e347ccc2425ff155f
Legacy running event mnemonic: surprise-protect-gorilla-glimpse
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 764282614
Root hash: 41f299b50ac40c8b3fb9f3257db7d892012924396df2b1e10a47e99ac15c068a3acf6f86c5026995253ab48e007893f0
(root) ConsistencyTestingToolState / dinner-siren-flat-multiply
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 soft-what-evil-canvas
  1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
  2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
  3 StringLeaf -730998593439248566 /3 leaf-put-ring-fun
  4 StringLeaf 610 /4 present-review-dinner-rug
node3 4m 46.654s 2025-09-26 03:08:01.648 7052 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 610
node2 4m 46.655s 2025-09-26 03:08:01.649 7042 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+03+31.391433828Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+07+14.096277183Z_seq1_minr474_maxr5474_orgn0.pces
node2 4m 46.655s 2025-09-26 03:08:01.649 7043 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 583 File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+07+14.096277183Z_seq1_minr474_maxr5474_orgn0.pces
node2 4m 46.655s 2025-09-26 03:08:01.649 7044 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 4m 46.656s 2025-09-26 03:08:01.650 7053 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 610
Timestamp: 2025-09-26T03:08:00.368181Z
Next consensus number: 19018
Legacy running event hash: 2ed33d65bcdb7b1931d60a1c8267ac0b42d099618c7c1670d7b6833f547d7962d3410f6f22e3319e347ccc2425ff155f
Legacy running event mnemonic: surprise-protect-gorilla-glimpse
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 764282614
Root hash: 41f299b50ac40c8b3fb9f3257db7d892012924396df2b1e10a47e99ac15c068a3acf6f86c5026995253ab48e007893f0
(root) ConsistencyTestingToolState / dinner-siren-flat-multiply
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 soft-what-evil-canvas
  1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
  2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
  3 StringLeaf -730998593439248566 /3 leaf-put-ring-fun
  4 StringLeaf 610 /4 present-review-dinner-rug
node2 4m 46.657s 2025-09-26 03:08:01.651 7045 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 4m 46.658s 2025-09-26 03:08:01.652 7047 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 610 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/610 {"round":610,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/610/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 4m 46.660s 2025-09-26 03:08:01.654 7055 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1
node3 4m 46.663s 2025-09-26 03:08:01.657 7054 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+03+31.256200715Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+07+14.042388201Z_seq1_minr474_maxr5474_orgn0.pces
node3 4m 46.664s 2025-09-26 03:08:01.658 7055 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 583 File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+07+14.042388201Z_seq1_minr474_maxr5474_orgn0.pces
node3 4m 46.664s 2025-09-26 03:08:01.658 7056 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 4m 46.666s 2025-09-26 03:08:01.660 7057 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 4m 46.666s 2025-09-26 03:08:01.660 7058 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 610 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/610 {"round":610,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/610/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 4m 46.668s 2025-09-26 03:08:01.662 7059 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1
node1 4m 46.670s 2025-09-26 03:08:01.664 7040 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 610
node1 4m 46.672s 2025-09-26 03:08:01.666 7041 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 610
Timestamp: 2025-09-26T03:08:00.368181Z
Next consensus number: 19018
Legacy running event hash: 2ed33d65bcdb7b1931d60a1c8267ac0b42d099618c7c1670d7b6833f547d7962d3410f6f22e3319e347ccc2425ff155f
Legacy running event mnemonic: surprise-protect-gorilla-glimpse
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 764282614
Root hash: 41f299b50ac40c8b3fb9f3257db7d892012924396df2b1e10a47e99ac15c068a3acf6f86c5026995253ab48e007893f0
(root) ConsistencyTestingToolState / dinner-siren-flat-multiply
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 soft-what-evil-canvas
  1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
  2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
  3 StringLeaf -730998593439248566 /3 leaf-put-ring-fun
  4 StringLeaf 610 /4 present-review-dinner-rug
node1 4m 46.679s 2025-09-26 03:08:01.673 7042 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+07+14.110613394Z_seq1_minr474_maxr5474_orgn0.pces Last file: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+03+31.280166514Z_seq0_minr1_maxr501_orgn0.pces
node1 4m 46.679s 2025-09-26 03:08:01.673 7043 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 583 File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+07+14.110613394Z_seq1_minr474_maxr5474_orgn0.pces
node1 4m 46.679s 2025-09-26 03:08:01.673 7044 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 4m 46.681s 2025-09-26 03:08:01.675 7045 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 4m 46.682s 2025-09-26 03:08:01.676 7046 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 610 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/610 {"round":610,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/610/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 4m 46.683s 2025-09-26 03:08:01.677 7047 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1
node1 5m 46.228s 2025-09-26 03:09:01.222 8572 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 748 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 5m 46.231s 2025-09-26 03:09:01.225 8538 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 748 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 5m 46.428s 2025-09-26 03:09:01.422 8572 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 748 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 5m 46.468s 2025-09-26 03:09:01.462 8678 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 748 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 5m 46.494s 2025-09-26 03:09:01.488 8575 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 748 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/748
node2 5m 46.495s 2025-09-26 03:09:01.489 8576 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 748
node3 5m 46.540s 2025-09-26 03:09:01.534 8682 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 748 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/748
node3 5m 46.541s 2025-09-26 03:09:01.535 8685 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 748
node2 5m 46.582s 2025-09-26 03:09:01.576 8607 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 748
node2 5m 46.584s 2025-09-26 03:09:01.578 8608 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 748
Timestamp: 2025-09-26T03:09:00.345156Z
Next consensus number: 22262
Legacy running event hash: d9156d4f012daf88f0199beb29e96774a87ac8df23351236f737a669b5066d137ef40ae79c2b43152d0c9db9438fa17d
Legacy running event mnemonic: talent-prepare-certain-audit
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -531838707
Root hash: bec60ad8a5626ccad8c59f62da7764d3c322ee83c40e56a9475d22ec541437cb7da5f117357341a312cad34ae0869e70
(root) ConsistencyTestingToolState / chef-force-fatigue-off
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 silk-ostrich-question-attract
  1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
  2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
  3 StringLeaf -7696481669779061704 /3 warrior-spread-explain-submit
  4 StringLeaf 748 /4 lazy-call-domain-cruel
node2 5m 46.590s 2025-09-26 03:09:01.584 8609 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+03+31.391433828Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+07+14.096277183Z_seq1_minr474_maxr5474_orgn0.pces
node2 5m 46.591s 2025-09-26 03:09:01.585 8610 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 721 File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+07+14.096277183Z_seq1_minr474_maxr5474_orgn0.pces
node2 5m 46.591s 2025-09-26 03:09:01.585 8611 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 5m 46.595s 2025-09-26 03:09:01.589 8612 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 5m 46.595s 2025-09-26 03:09:01.589 8613 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 748 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/748 {"round":748,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/748/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 5m 46.597s 2025-09-26 03:09:01.591 8614 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/62
node3 5m 46.624s 2025-09-26 03:09:01.618 8725 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 748
node3 5m 46.626s 2025-09-26 03:09:01.620 8726 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 748
Timestamp: 2025-09-26T03:09:00.345156Z
Next consensus number: 22262
Legacy running event hash: d9156d4f012daf88f0199beb29e96774a87ac8df23351236f737a669b5066d137ef40ae79c2b43152d0c9db9438fa17d
Legacy running event mnemonic: talent-prepare-certain-audit
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -531838707
Root hash: bec60ad8a5626ccad8c59f62da7764d3c322ee83c40e56a9475d22ec541437cb7da5f117357341a312cad34ae0869e70
(root) ConsistencyTestingToolState / chef-force-fatigue-off
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 silk-ostrich-question-attract
  1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
  2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
  3 StringLeaf -7696481669779061704 /3 warrior-spread-explain-submit
  4 StringLeaf 748 /4 lazy-call-domain-cruel
node3 5m 46.633s 2025-09-26 03:09:01.627 8727 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+03+31.256200715Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+07+14.042388201Z_seq1_minr474_maxr5474_orgn0.pces
node3 5m 46.633s 2025-09-26 03:09:01.627 8728 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 721 File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+07+14.042388201Z_seq1_minr474_maxr5474_orgn0.pces
node3 5m 46.633s 2025-09-26 03:09:01.627 8729 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 5m 46.637s 2025-09-26 03:09:01.631 8730 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 5m 46.638s 2025-09-26 03:09:01.632 8731 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 748 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/748 {"round":748,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/748/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 5m 46.639s 2025-09-26 03:09:01.633 8732 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/62
node1 5m 46.645s 2025-09-26 03:09:01.639 8575 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 748 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/748
node1 5m 46.646s 2025-09-26 03:09:01.640 8576 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 748
node0 5m 46.650s 2025-09-26 03:09:01.644 8541 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 748 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/748
node0 5m 46.651s 2025-09-26 03:09:01.645 8542 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 748
node0 5m 46.738s 2025-09-26 03:09:01.732 8585 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 748
node0 5m 46.740s 2025-09-26 03:09:01.734 8586 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 748
Timestamp: 2025-09-26T03:09:00.345156Z
Next consensus number: 22262
Legacy running event hash: d9156d4f012daf88f0199beb29e96774a87ac8df23351236f737a669b5066d137ef40ae79c2b43152d0c9db9438fa17d
Legacy running event mnemonic: talent-prepare-certain-audit
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -531838707
Root hash: bec60ad8a5626ccad8c59f62da7764d3c322ee83c40e56a9475d22ec541437cb7da5f117357341a312cad34ae0869e70
(root) ConsistencyTestingToolState / chef-force-fatigue-off
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 silk-ostrich-question-attract
  1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
  2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
  3 StringLeaf -7696481669779061704 /3 warrior-spread-explain-submit
  4 StringLeaf 748 /4 lazy-call-domain-cruel
node1 5m 46.745s 2025-09-26 03:09:01.739 8615 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 748
node1 5m 46.746s 2025-09-26 03:09:01.740 8616 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 748
Timestamp: 2025-09-26T03:09:00.345156Z
Next consensus number: 22262
Legacy running event hash: d9156d4f012daf88f0199beb29e96774a87ac8df23351236f737a669b5066d137ef40ae79c2b43152d0c9db9438fa17d
Legacy running event mnemonic: talent-prepare-certain-audit
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -531838707
Root hash: bec60ad8a5626ccad8c59f62da7764d3c322ee83c40e56a9475d22ec541437cb7da5f117357341a312cad34ae0869e70
(root) ConsistencyTestingToolState / chef-force-fatigue-off
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 silk-ostrich-question-attract
  1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
  2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
  3 StringLeaf -7696481669779061704 /3 warrior-spread-explain-submit
  4 StringLeaf 748 /4 lazy-call-domain-cruel
node0 5m 46.747s 2025-09-26 03:09:01.741 8587 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+07+14.082954667Z_seq1_minr474_maxr5474_orgn0.pces Last file: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+03+31.273927897Z_seq0_minr1_maxr501_orgn0.pces
node0 5m 46.747s 2025-09-26 03:09:01.741 8588 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 721 File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+07+14.082954667Z_seq1_minr474_maxr5474_orgn0.pces
node0 5m 46.747s 2025-09-26 03:09:01.741 8589 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 5m 46.752s 2025-09-26 03:09:01.746 8590 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 5m 46.752s 2025-09-26 03:09:01.746 8591 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 748 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/748 {"round":748,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/748/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 5m 46.754s 2025-09-26 03:09:01.748 8592 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/62
node1 5m 46.754s 2025-09-26 03:09:01.748 8617 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+07+14.110613394Z_seq1_minr474_maxr5474_orgn0.pces Last file: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+03+31.280166514Z_seq0_minr1_maxr501_orgn0.pces
node1 5m 46.754s 2025-09-26 03:09:01.748 8618 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 721 File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+07+14.110613394Z_seq1_minr474_maxr5474_orgn0.pces
node1 5m 46.754s 2025-09-26 03:09:01.748 8619 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 5m 46.759s 2025-09-26 03:09:01.753 8620 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 5m 46.759s 2025-09-26 03:09:01.753 8621 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 748 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/748 {"round":748,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/748/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 5m 46.761s 2025-09-26 03:09:01.755 8622 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/62
node4 5m 51.725s 2025-09-26 03:09:06.719 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node4 5m 51.813s 2025-09-26 03:09:06.807 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node4 5m 51.828s 2025-09-26 03:09:06.822 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 5m 51.945s 2025-09-26 03:09:06.939 4 INFO STARTUP <main> Browser: The following nodes [4] are set to run locally
node4 5m 51.952s 2025-09-26 03:09:06.946 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node4 5m 51.963s 2025-09-26 03:09:06.957 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 5m 52.384s 2025-09-26 03:09:07.378 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node4 5m 52.385s 2025-09-26 03:09:07.379 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node4 5m 53.245s 2025-09-26 03:09:08.239 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 860ms
node4 5m 53.254s 2025-09-26 03:09:08.248 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node4 5m 53.260s 2025-09-26 03:09:08.254 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 5m 53.314s 2025-09-26 03:09:08.308 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node4 5m 53.375s 2025-09-26 03:09:08.369 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node4 5m 53.376s 2025-09-26 03:09:08.370 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node4 5m 55.376s 2025-09-26 03:09:10.370 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node4 5m 55.463s 2025-09-26 03:09:10.457 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 55.470s 2025-09-26 03:09:10.464 21 INFO STARTUP <main> StartupStateUtils: The following saved states were found on disk:
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/334/SignedState.swh
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/197/SignedState.swh
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/62/SignedState.swh
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1/SignedState.swh
node4 5m 55.471s 2025-09-26 03:09:10.465 22 INFO STARTUP <main> StartupStateUtils: Loading latest state from disk.
node4 5m 55.471s 2025-09-26 03:09:10.465 23 INFO STARTUP <main> StartupStateUtils: Loading signed state from disk: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/334/SignedState.swh
node4 5m 55.475s 2025-09-26 03:09:10.469 24 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 5m 55.479s 2025-09-26 03:09:10.473 25 INFO STATE_TO_DISK <main> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp
node4 5m 55.605s 2025-09-26 03:09:10.599 36 INFO STARTUP <main> StartupStateUtils: Loaded state's hash is the same as when it was saved.
node4 5m 55.609s 2025-09-26 03:09:10.603 37 INFO STARTUP <main> StartupStateUtils: Platform has loaded a saved state {"round":334,"consensusTimestamp":"2025-09-26T03:06:00.152761Z"} [com.swirlds.logging.legacy.payload.SavedStateLoadedPayload]
node4 5m 55.612s 2025-09-26 03:09:10.606 40 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 55.614s 2025-09-26 03:09:10.608 43 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node4 5m 55.616s 2025-09-26 03:09:10.610 44 INFO STARTUP <main> AddressBookInitializer: Using the loaded state's address book and weight values.
node4 5m 55.623s 2025-09-26 03:09:10.617 45 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 55.624s 2025-09-26 03:09:10.618 46 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 56.652s 2025-09-26 03:09:11.646 47 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26151584]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=173480, randomLong=-3612480805885770272, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=10060, randomLong=4237219725621717657, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=846659, data=35, exception=null]
OS Health Check Report - Complete (took 1015 ms)
node4 5m 56.677s 2025-09-26 03:09:11.671 48 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node4 5m 56.803s 2025-09-26 03:09:11.797 49 INFO STARTUP <main> PcesUtilities: Span compaction completed for data/saved/preconsensus-events/4/2025/09/26/2025-09-26T03+03+31.142041701Z_seq0_minr1_maxr501_orgn0.pces, new upper bound is 391
node4 5m 56.805s 2025-09-26 03:09:11.799 50 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node4 5m 56.810s 2025-09-26 03:09:11.804 51 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node4 5m 56.877s 2025-09-26 03:09:11.871 52 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "InsIeg==", "port": 30124 }, { "ipAddressV4": "CoAAWQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "I+gaFw==", "port": 30125 }, { "ipAddressV4": "CoAAWw==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "InlxqA==", "port": 30126 }, { "ipAddressV4": "CoAAWA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "aJrAnw==", "port": 30127 }, { "ipAddressV4": "CoAACA==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "I+9Y8g==", "port": 30128 }, { "ipAddressV4": "CoAAWg==", "port": 30128 }] }] }
node4 5m 56.895s 2025-09-26 03:09:11.889 53 INFO STARTUP <main> ConsistencyTestingToolState: State initialized with state long 575583921648146648.
node4 5m 56.896s 2025-09-26 03:09:11.890 54 INFO STARTUP <main> ConsistencyTestingToolState: State initialized with 334 rounds handled.
node4 5m 56.896s 2025-09-26 03:09:11.890 55 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 5m 56.897s 2025-09-26 03:09:11.891 56 INFO STARTUP <main> TransactionHandlingHistory: Log file found. Parsing previous history
node4 5m 57.650s 2025-09-26 03:09:12.644 57 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 334
Timestamp: 2025-09-26T03:06:00.152761Z
Next consensus number: 11771
Legacy running event hash: 5b7614730a597f7a27f273a91bfe25027d6d839ba6c4bafa68c60a50aaf6e4c81bcbb6e6011b007d6eb4c526c58e084f
Legacy running event mnemonic: human-mobile-enroll-urban
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1962935126
Root hash: 58ba5a38bf480a34644d16e867cc572408ee2fdd28663d3f6b66234121c442fcccbc2a9d639aae80e751c342933437cc
(root) ConsistencyTestingToolState / blanket-either-blade-educate
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 hospital-quantum-congress-enroll
  1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
  2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
  3 StringLeaf 575583921648146648 /3 muscle-narrow-doctor-number
  4 StringLeaf 334 /4 feel-smile-tourist-patch
node4 5m 57.906s 2025-09-26 03:09:12.900 59 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 5b7614730a597f7a27f273a91bfe25027d6d839ba6c4bafa68c60a50aaf6e4c81bcbb6e6011b007d6eb4c526c58e084f
node4 5m 57.919s 2025-09-26 03:09:12.913 60 INFO STARTUP <platformForkJoinThread-4> Shadowgraph: Shadowgraph starting from expiration threshold 306
node4 5m 57.927s 2025-09-26 03:09:12.921 62 INFO STARTUP <<start-node-4>> ConsistencyTestingToolMain: init called in Main for node 4.
node4 5m 57.928s 2025-09-26 03:09:12.922 63 INFO STARTUP <<start-node-4>> SwirldsPlatform: Starting platform 4
node4 5m 57.930s 2025-09-26 03:09:12.924 64 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node4 5m 57.933s 2025-09-26 03:09:12.927 65 INFO STARTUP <<start-node-4>> CycleFinder: No cyclical back pressure detected in wiring model.
node4 5m 57.935s 2025-09-26 03:09:12.929 66 INFO STARTUP <<start-node-4>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node4 5m 57.936s 2025-09-26 03:09:12.930 67 INFO STARTUP <<start-node-4>> InputWireChecks: All input wires have been bound.
node4 5m 57.938s 2025-09-26 03:09:12.932 68 INFO STARTUP <<start-node-4>> SwirldsPlatform: replaying preconsensus event stream starting at 306
node4 5m 57.943s 2025-09-26 03:09:12.937 69 INFO PLATFORM_STATUS <platformForkJoinThread-1> DefaultStatusStateMachine: Platform spent 194.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node4 5m 58.231s 2025-09-26 03:09:13.225 70 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:4 H:77ada61963b6 BR:332), num remaining: 4
node4 5m 58.232s 2025-09-26 03:09:13.226 71 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:3 H:4cb3f52948ef BR:332), num remaining: 3
node4 5m 58.233s 2025-09-26 03:09:13.227 72 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:2 H:d63061028e59 BR:332), num remaining: 2
node4 5m 58.234s 2025-09-26 03:09:13.228 73 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:0 H:7d6674f51364 BR:332), num remaining: 1
node4 5m 58.235s 2025-09-26 03:09:13.229 74 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:1 H:eb44072c578d BR:333), num remaining: 0
node4 5m 58.654s 2025-09-26 03:09:13.648 476 INFO STARTUP <<start-node-4>> PcesReplayer: Replayed 3,019 preconsensus events with max birth round 391. These events contained 4,187 transactions. 57 rounds reached consensus spanning 25.3 seconds of consensus time. The latest round to reach consensus is round 391. Replay took 715.0 milliseconds.
node4 5m 58.655s 2025-09-26 03:09:13.649 479 INFO STARTUP <<app: appMain 4>> ConsistencyTestingToolMain: run called in Main.
node4 5m 58.657s 2025-09-26 03:09:13.651 480 INFO PLATFORM_STATUS <platformForkJoinThread-7> DefaultStatusStateMachine: Platform spent 710.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node4 5m 59.502s 2025-09-26 03:09:14.496 592 INFO RECONNECT <<platform-core: SyncProtocolWith1 4 to 1>> RpcPeerHandler: SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=391,ancientThreshold=364,expiredThreshold=306] remote ev=EventWindow[latestConsensusRound=778,ancientThreshold=751,expiredThreshold=677]
node4 5m 59.502s 2025-09-26 03:09:14.496 591 INFO RECONNECT <<platform-core: SyncProtocolWith0 4 to 0>> RpcPeerHandler: SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=391,ancientThreshold=364,expiredThreshold=306] remote ev=EventWindow[latestConsensusRound=778,ancientThreshold=751,expiredThreshold=677]
node0 5m 59.573s 2025-09-26 03:09:14.567 8907 INFO RECONNECT <<platform-core: SyncProtocolWith4 0 to 4>> RpcPeerHandler: OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=778,ancientThreshold=751,expiredThreshold=677] remote ev=EventWindow[latestConsensusRound=391,ancientThreshold=364,expiredThreshold=306]
node1 5m 59.573s 2025-09-26 03:09:14.567 8951 INFO RECONNECT <<platform-core: SyncProtocolWith4 1 to 4>> RpcPeerHandler: OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=778,ancientThreshold=751,expiredThreshold=677] remote ev=EventWindow[latestConsensusRound=391,ancientThreshold=364,expiredThreshold=306]
node2 5m 59.573s 2025-09-26 03:09:14.567 8961 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> RpcPeerHandler: OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=778,ancientThreshold=751,expiredThreshold=677] remote ev=EventWindow[latestConsensusRound=391,ancientThreshold=364,expiredThreshold=306]
node3 5m 59.573s 2025-09-26 03:09:14.567 9073 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> RpcPeerHandler: OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=778,ancientThreshold=751,expiredThreshold=677] remote ev=EventWindow[latestConsensusRound=391,ancientThreshold=364,expiredThreshold=306]
node4 5m 59.642s 2025-09-26 03:09:14.636 594 INFO RECONNECT <<platform-core: SyncProtocolWith3 4 to 3>> RpcPeerHandler: SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=391,ancientThreshold=364,expiredThreshold=306] remote ev=EventWindow[latestConsensusRound=778,ancientThreshold=751,expiredThreshold=677]
node4 5m 59.642s 2025-09-26 03:09:14.636 593 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> RpcPeerHandler: SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=391,ancientThreshold=364,expiredThreshold=306] remote ev=EventWindow[latestConsensusRound=778,ancientThreshold=751,expiredThreshold=677]
node4 5m 59.643s 2025-09-26 03:09:14.637 595 INFO PLATFORM_STATUS <platformForkJoinThread-2> DefaultStatusStateMachine: Platform spent 985.0 ms in OBSERVING. Now in BEHIND
node4 5m 59.644s 2025-09-26 03:09:14.638 596 INFO RECONNECT <platformForkJoinThread-5> ReconnectController: Starting ReconnectController
node4 5m 59.645s 2025-09-26 03:09:14.639 597 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectPlatformHelperImpl: Preparing for reconnect, stopping gossip
node4 5m 59.797s 2025-09-26 03:09:14.791 598 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectPlatformHelperImpl: Preparing for reconnect, start clearing queues
node4 5m 59.799s 2025-09-26 03:09:14.793 599 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectPlatformHelperImpl: Queues have been cleared
node4 5m 59.800s 2025-09-26 03:09:14.794 600 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectController: waiting for reconnect connection
node4 5m 59.800s 2025-09-26 03:09:14.794 601 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectController: acquired reconnect connection
node3 5m 59.887s 2025-09-26 03:09:14.881 9085 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> ReconnectTeacher: Starting reconnect in the role of the sender {"receiving":false,"nodeId":3,"otherNodeId":4,"round":778} [com.swirlds.logging.legacy.payload.ReconnectStartPayload]
node3 5m 59.888s 2025-09-26 03:09:14.882 9086 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> ReconnectTeacher: The following state will be sent to the learner:
Round: 778
Timestamp: 2025-09-26T03:09:13.365331Z
Next consensus number: 22984
Legacy running event hash: 14f9fe8fbbdd532fa590dae5caaf396817f389648af0d17c6b87f615e00b50e593ae731daab7aa72c5091f56d47d8a9c
Legacy running event mnemonic: palace-comfort-love-grass
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 2138287084
Root hash: 38dd62bd1f67a6f1fc4b8f19a3422fc15dcb66f3002ec9f9e09daa65547f4acb0070155951ba5b9a0fa6eb1abcbfda72
(root) ConsistencyTestingToolState / brick-fish-undo-capable
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 chef-level-weird-elevator
  1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
  2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
  3 StringLeaf -8026088537578682213 /3 habit-grant-mixed-whisper
  4 StringLeaf 778 /4 square-drop-police-pink
node3 5m 59.889s 2025-09-26 03:09:14.883 9087 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> ReconnectTeacher: Sending signatures from nodes 0, 2, 3 (signing weight = 37500000000/50000000000) for state hash 38dd62bd1f67a6f1fc4b8f19a3422fc15dcb66f3002ec9f9e09daa65547f4acb0070155951ba5b9a0fa6eb1abcbfda72
node3 5m 59.889s 2025-09-26 03:09:14.883 9088 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> ReconnectTeacher: Starting synchronization in the role of the sender.
node3 5m 59.895s 2025-09-26 03:09:14.889 9089 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> TeachingSynchronizer: sending tree rooted at com.swirlds.demo.consistency.ConsistencyTestingToolState with route []
node3 5m 59.903s 2025-09-26 03:09:14.897 9090 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@7fcb7594 start run()
node4 5m 59.955s 2025-09-26 03:09:14.949 602 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectSyncHelper: Starting reconnect in role of the receiver. {"receiving":true,"nodeId":4,"otherNodeId":3,"round":390} [com.swirlds.logging.legacy.payload.ReconnectStartPayload]
node4 5m 59.958s 2025-09-26 03:09:14.952 603 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectLearner: Receiving signed state signatures
node4 5m 59.960s 2025-09-26 03:09:14.954 604 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectLearner: Received signatures from nodes 0, 2, 3
node4 5m 59.963s 2025-09-26 03:09:14.957 605 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls receiveTree()
node4 5m 59.963s 2025-09-26 03:09:14.957 606 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: synchronizing tree
node4 5m 59.964s 2025-09-26 03:09:14.958 607 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: receiving tree rooted at com.swirlds.demo.consistency.ConsistencyTestingToolState with route []
node4 5m 59.970s 2025-09-26 03:09:14.964 608 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@3dcffaa start run()
node4 5m 59.975s 2025-09-26 03:09:14.969 609 INFO STARTUP <<work group learning-synchronizer: async-input-stream #0>> ConsistencyTestingToolState: New State Constructed.
node3 6.001m 2025-09-26 03:09:15.051 9109 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@7fcb7594 finish run()
node3 6.001m 2025-09-26 03:09:15.052 9110 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> TeachingSynchronizer: finished sending tree
node3 6.001m 2025-09-26 03:09:15.053 9111 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> TeachingSynchronizer: sending tree rooted at com.swirlds.virtualmap.VirtualMap with route [2]
node3 6.001m 2025-09-26 03:09:15.054 9112 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@25c1040c start run()
node4 6.003m 2025-09-26 03:09:15.162 633 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread finished the learning loop for the current subtree
node4 6.003m 2025-09-26 03:09:15.162 634 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread closed input, output, and view for the current subtree
node4 6.003m 2025-09-26 03:09:15.163 635 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@3dcffaa finish run()
node4 6.003m 2025-09-26 03:09:15.164 636 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: received tree rooted at com.swirlds.demo.consistency.ConsistencyTestingToolState with route []
node4 6.003m 2025-09-26 03:09:15.164 637 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: receiving tree rooted at com.swirlds.virtualmap.VirtualMap with route [2]
node4 6.003m 2025-09-26 03:09:15.167 638 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@17d407eb start run()
node4 6.004m 2025-09-26 03:09:15.231 639 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> ReconnectNodeRemover: setPathInformation(): firstLeafPath: 1 -> 1, lastLeafPath: 1 -> 1
node4 6.004m 2025-09-26 03:09:15.232 640 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> ReconnectNodeRemover: setPathInformation(): done
node4 6.004m 2025-09-26 03:09:15.234 641 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread finished the learning loop for the current subtree
node4 6.004m 2025-09-26 03:09:15.234 642 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: call nodeRemover.allNodesReceived()
node4 6.004m 2025-09-26 03:09:15.235 643 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> ReconnectNodeRemover: allNodesReceived()
node4 6.004m 2025-09-26 03:09:15.235 644 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> ReconnectNodeRemover: allNodesReceived(): done
node4 6.004m 2025-09-26 03:09:15.235 645 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: call root.endLearnerReconnect()
node4 6.004m 2025-09-26 03:09:15.235 646 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call reconnectIterator.close()
node4 6.004m 2025-09-26 03:09:15.235 647 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call setHashPrivate()
node3 6.005m 2025-09-26 03:09:15.304 9116 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@25c1040c finish run()
node3 6.005m 2025-09-26 03:09:15.305 9117 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> TeachingSynchronizer: finished sending tree
node3 6.005m 2025-09-26 03:09:15.308 9120 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> ReconnectTeacher: Finished synchronization in the role of the sender.
node4 6.007m 2025-09-26 03:09:15.386 657 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call postInit()
node4 6.007m 2025-09-26 03:09:15.387 659 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: endLearnerReconnect() complete
node4 6.007m 2025-09-26 03:09:15.388 660 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: close() complete
node4 6.007m 2025-09-26 03:09:15.388 661 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread closed input, output, and view for the current subtree
node4 6.007m 2025-09-26 03:09:15.389 662 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@17d407eb finish run()
node4 6.007m 2025-09-26 03:09:15.389 663 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: received tree rooted at com.swirlds.virtualmap.VirtualMap with route [2]
node4 6.007m 2025-09-26 03:09:15.390 664 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: synchronization complete
node4 6.007m 2025-09-26 03:09:15.390 665 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls initialize()
node4 6.007m 2025-09-26 03:09:15.390 666 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: initializing tree
node4 6.007m 2025-09-26 03:09:15.391 667 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: initialization complete
node4 6.007m 2025-09-26 03:09:15.391 668 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls hash()
node4 6.007m 2025-09-26 03:09:15.391 669 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: hashing tree
node4 6.007m 2025-09-26 03:09:15.392 670 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: hashing complete
node4 6.007m 2025-09-26 03:09:15.392 671 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls logStatistics()
node4 6.007m 2025-09-26 03:09:15.395 672 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: Finished synchronization {"timeInSeconds":0.432,"hashTimeInSeconds":0.001,"initializationTimeInSeconds":0.001,"totalNodes":12,"leafNodes":7,"redundantLeafNodes":4,"internalNodes":5,"redundantInternalNodes":2} [com.swirlds.logging.legacy.payload.SynchronizationCompletePayload]
node4 6.007m 2025-09-26 03:09:15.396 673 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: ReconnectMapMetrics: transfersFromTeacher=12; transfersFromLearner=10; internalHashes=0; internalCleanHashes=0; internalData=0; internalCleanData=0; leafHashes=3; leafCleanHashes=3; leafData=7; leafCleanData=4
node4 6.007m 2025-09-26 03:09:15.396 674 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner is done synchronizing
node4 6.007m 2025-09-26 03:09:15.399 675 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectLearner: Reconnect data usage report {"dataMegabytes":0.006056785583496094} [com.swirlds.logging.legacy.payload.ReconnectDataUsagePayload]
node4 6.007m 2025-09-26 03:09:15.401 676 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectSyncHelper: Finished reconnect in the role of the receiver. {"receiving":true,"nodeId":4,"otherNodeId":3,"round":778,"success":false} [com.swirlds.logging.legacy.payload.ReconnectFinishPayload]
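Several of the entries above (SynchronizationCompletePayload, ReconnectDataUsagePayload, ReconnectFinishPayload) end with a JSON object followed by the payload class name in brackets. A minimal parsing sketch follows; the regex and helper are illustrative assumptions based only on the log format shown here, not part of the platform itself. It also makes explicit the consistency check visible in node4's numbers (12 transferred nodes = 7 leaves + 5 internals).

import json
import re

# Hypothetical helper: pull the JSON payload that some entries append before the
# trailing "[com.swirlds.logging.legacy.payload.XxxPayload]" class tag.
PAYLOAD_RE = re.compile(r"(\{.*\})\s*\[com\.swirlds\.logging\.legacy\.payload\.\w+\]\s*$")

def extract_payload(line: str) -> dict | None:
    match = PAYLOAD_RE.search(line)
    return json.loads(match.group(1)) if match else None

line = ('LearningSynchronizer: Finished synchronization {"timeInSeconds":0.432,'
        '"hashTimeInSeconds":0.001,"initializationTimeInSeconds":0.001,"totalNodes":12,'
        '"leafNodes":7,"redundantLeafNodes":4,"internalNodes":5,"redundantInternalNodes":2} '
        '[com.swirlds.logging.legacy.payload.SynchronizationCompletePayload]')
stats = extract_payload(line)
# Sanity check on the node4 reconnect statistics: totalNodes = leafNodes + internalNodes.
assert stats["totalNodes"] == stats["leafNodes"] + stats["internalNodes"]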
node4 6.007m 2025-09-26 03:09:15.402 677 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectSyncHelper: Information for state received during reconnect:
Round: 778
Timestamp: 2025-09-26T03:09:13.365331Z
Next consensus number: 22984
Legacy running event hash: 14f9fe8fbbdd532fa590dae5caaf396817f389648af0d17c6b87f615e00b50e593ae731daab7aa72c5091f56d47d8a9c
Legacy running event mnemonic: palace-comfort-love-grass
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 2138287084
Root hash: 38dd62bd1f67a6f1fc4b8f19a3422fc15dcb66f3002ec9f9e09daa65547f4acb0070155951ba5b9a0fa6eb1abcbfda72
(root) ConsistencyTestingToolState / brick-fish-undo-capable
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 chef-level-weird-elevator
  1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
  2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
  3 StringLeaf -8026088537578682213 /3 habit-grant-mixed-whisper
  4 StringLeaf 778 /4 square-drop-police-pink
node4 6.007m 2025-09-26 03:09:15.403 679 DEBUG RECONNECT <<reconnect: reconnect-controller>> ReconnectStateLoader: `loadReconnectState` : reloading state
node4 6.007m 2025-09-26 03:09:15.403 680 INFO STARTUP <<reconnect: reconnect-controller>> ConsistencyTestingToolState: State initialized with state long -8026088537578682213.
node4 6.007m 2025-09-26 03:09:15.403 681 INFO STARTUP <<reconnect: reconnect-controller>> ConsistencyTestingToolState: State initialized with 778 rounds handled.
node4 6.007m 2025-09-26 03:09:15.404 682 INFO STARTUP <<reconnect: reconnect-controller>> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 6.007m 2025-09-26 03:09:15.404 683 INFO STARTUP <<reconnect: reconnect-controller>> TransactionHandlingHistory: Log file found. Parsing previous history
node4 6.007m 2025-09-26 03:09:15.428 690 INFO STATE_TO_DISK <<reconnect: reconnect-controller>> DefaultSavedStateController: Signed state from round 778 created, will eventually be written to disk, for reason: RECONNECT
node4 6.007m 2025-09-26 03:09:15.428 691 INFO PLATFORM_STATUS <platformForkJoinThread-2> DefaultStatusStateMachine: Platform spent 790.0 ms in BEHIND. Now in RECONNECT_COMPLETE
node4 6.007m 2025-09-26 03:09:15.429 692 INFO STARTUP <platformForkJoinThread-6> Shadowgraph: Shadowgraph starting from expiration threshold 751
node4 6.007m 2025-09-26 03:09:15.431 695 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 778 state to disk. Reason: RECONNECT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/778
node4 6.007m 2025-09-26 03:09:15.433 696 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/2 for round 778
node4 6.007m 2025-09-26 03:09:15.443 706 INFO EVENT_STREAM <<reconnect: reconnect-controller>> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 14f9fe8fbbdd532fa590dae5caaf396817f389648af0d17c6b87f615e00b50e593ae731daab7aa72c5091f56d47d8a9c
node4 6.008m 2025-09-26 03:09:15.444 707 INFO STARTUP <platformForkJoinThread-8> PcesFileManager: Due to recent operations on this node, the local preconsensus event stream will have a discontinuity. The last file with the old origin round is 2025-09-26T03+03+31.142041701Z_seq0_minr1_maxr391_orgn0.pces. All future files will have an origin round of 778.
node3 6.008m 2025-09-26 03:09:15.472 9129 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> ReconnectTeacher: Finished reconnect in the role of the sender. {"receiving":false,"nodeId":3,"otherNodeId":4,"round":778,"success":false} [com.swirlds.logging.legacy.payload.ReconnectFinishPayload]
node4 6.010m 2025-09-26 03:09:15.574 741 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/2 for round 778
node4 6.010m 2025-09-26 03:09:15.577 742 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 778
Timestamp: 2025-09-26T03:09:13.365331Z
Next consensus number: 22984
Legacy running event hash: 14f9fe8fbbdd532fa590dae5caaf396817f389648af0d17c6b87f615e00b50e593ae731daab7aa72c5091f56d47d8a9c
Legacy running event mnemonic: palace-comfort-love-grass
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 2138287084
Root hash: 38dd62bd1f67a6f1fc4b8f19a3422fc15dcb66f3002ec9f9e09daa65547f4acb0070155951ba5b9a0fa6eb1abcbfda72
(root) ConsistencyTestingToolState / brick-fish-undo-capable
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 chef-level-weird-elevator
  1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
  2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
  3 StringLeaf -8026088537578682213 /3 habit-grant-mixed-whisper
  4 StringLeaf 778 /4 square-drop-police-pink
node4 6.011m 2025-09-26 03:09:15.631 743 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T03+03+31.142041701Z_seq0_minr1_maxr391_orgn0.pces
node4 6.011m 2025-09-26 03:09:15.633 744 WARN STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: No preconsensus event files meeting specified criteria found to copy. Lower bound: 751
node4 6.011m 2025-09-26 03:09:15.640 745 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 778 to disk. Reason: RECONNECT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/778 {"round":778,"freezeState":false,"reason":"RECONNECT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/778/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
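The two "Information for state ..." dumps above (the state node4 received during reconnect and the round-778 state it then wrote to disk) report the same root hash. A small sketch making that comparison explicit; only the "Root hash:" field name comes from the log, and the helper is purely illustrative.

import re

ROOT_HASH_RE = re.compile(r"Root hash:\s*([0-9a-f]+)")

def root_hash(state_info: str) -> str:
    # Extract the hex digest following "Root hash:" in one of the state dumps.
    return ROOT_HASH_RE.search(state_info).group(1)

received = "... Root hash: 38dd62bd1f67a6f1fc4b8f19a3422fc15dcb66f3002ec9f9e09daa65547f4acb0070155951ba5b9a0fa6eb1abcbfda72 ..."
written = "... Root hash: 38dd62bd1f67a6f1fc4b8f19a3422fc15dcb66f3002ec9f9e09daa65547f4acb0070155951ba5b9a0fa6eb1abcbfda72 ..."
assert root_hash(received) == root_hash(written)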
node4 6.011m 2025-09-26 03:09:15.646 746 INFO PLATFORM_STATUS <platformForkJoinThread-8> DefaultStatusStateMachine: Platform spent 216.0 ms in RECONNECT_COMPLETE. Now in CHECKING
node4 6.016m 2025-09-26 03:09:15.934 747 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting4.csv' ]
node4 6.016m 2025-09-26 03:09:15.940 748 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node4 6m 1.458s 2025-09-26 03:09:16.452 749 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:2 H:0d8a85a30535 BR:776), num remaining: 3
node4 6m 1.459s 2025-09-26 03:09:16.453 750 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:0 H:f3389664549e BR:776), num remaining: 2
node4 6m 1.459s 2025-09-26 03:09:16.453 751 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:1 H:54ce57bcca82 BR:776), num remaining: 1
node4 6m 1.459s 2025-09-26 03:09:16.453 752 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:3 H:6353d513d603 BR:776), num remaining: 0
node0 6m 2.665s 2025-09-26 03:09:17.659 8985 INFO RECONNECT <<platform-core: SyncProtocolWith4 0 to 4>> RpcPeerHandler: OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=785,ancientThreshold=758,expiredThreshold=684] remote ev=EventWindow[latestConsensusRound=391,ancientThreshold=364,expiredThreshold=306]
node4 6m 2.735s 2025-09-26 03:09:17.729 821 INFO RECONNECT <<platform-core: SyncProtocolWith0 4 to 0>> RpcPeerHandler: SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=391,ancientThreshold=364,expiredThreshold=306] remote ev=EventWindow[latestConsensusRound=785,ancientThreshold=758,expiredThreshold=684]
node4 6m 2.736s 2025-09-26 03:09:17.730 822 INFO RECONNECT <<platform-core: SyncProtocolWith0 4 to 0>> RpcPeerHandler: Latest event window is not really falling behind, will retry sync local ev=EventWindow[latestConsensusRound=785,ancientThreshold=758,expiredThreshold=751] remote ev=EventWindow[latestConsensusRound=785,ancientThreshold=758,expiredThreshold=684]
node1 6m 4.583s 2025-09-26 03:09:19.577 9097 INFO RECONNECT <<platform-core: SyncProtocolWith4 1 to 4>> RpcPeerHandler: OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=789,ancientThreshold=762,expiredThreshold=688] remote ev=EventWindow[latestConsensusRound=391,ancientThreshold=364,expiredThreshold=306]
node4 6m 4.654s 2025-09-26 03:09:19.648 873 INFO RECONNECT <<platform-core: SyncProtocolWith1 4 to 1>> RpcPeerHandler: SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=391,ancientThreshold=364,expiredThreshold=306] remote ev=EventWindow[latestConsensusRound=789,ancientThreshold=762,expiredThreshold=688]
node4 6m 4.654s 2025-09-26 03:09:19.648 874 INFO RECONNECT <<platform-core: SyncProtocolWith1 4 to 1>> RpcPeerHandler: Latest event window is not really falling behind, will retry sync local ev=EventWindow[latestConsensusRound=789,ancientThreshold=762,expiredThreshold=751] remote ev=EventWindow[latestConsensusRound=789,ancientThreshold=762,expiredThreshold=688]
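The SELF_FALLEN_BEHIND / OTHER_FALLEN_BEHIND lines above print an EventWindow for each side of the connection. The stand-in class and comparison rule below are assumptions made only to illustrate the pattern visible in these entries: node4's stale window (latestConsensusRound=391) looks far behind its peers (785/789), but once the post-reconnect window is used (785) the handler reports it is "not really falling behind" and retries a normal sync.

from dataclasses import dataclass

@dataclass
class EventWindow:
    # Only the fields printed in the log lines above; the real platform class has more.
    latest_consensus_round: int
    ancient_threshold: int
    expired_threshold: int

def looks_fallen_behind(local: EventWindow, peer: EventWindow) -> bool:
    # Hypothetical rule: behind if our newest consensus round is already ancient to the peer.
    return local.latest_consensus_round < peer.ancient_threshold

stale_node4 = EventWindow(391, 364, 306)
peer_node0 = EventWindow(785, 758, 684)
refreshed_node4 = EventWindow(785, 758, 751)

assert looks_fallen_behind(stale_node4, peer_node0)
assert not looks_fallen_behind(refreshed_node4, peer_node0)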
node4 6m 6.397s 2025-09-26 03:09:21.391 894 INFO PLATFORM_STATUS <platformForkJoinThread-4> DefaultStatusStateMachine: Platform spent 5.7 s in CHECKING. Now in ACTIVE
node0 6m 45.921s 2025-09-26 03:10:00.915 9997 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 878 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 6m 45.965s 2025-09-26 03:10:00.959 10072 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 878 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 6m 45.993s 2025-09-26 03:10:00.987 10187 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 878 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 6m 46.003s 2025-09-26 03:10:00.997 1823 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 878 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 6m 46.059s 2025-09-26 03:10:01.053 10059 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 878 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 6m 46.178s 2025-09-26 03:10:01.172 10075 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 878 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/878
node2 6m 46.179s 2025-09-26 03:10:01.173 10076 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 878
node0 6m 46.209s 2025-09-26 03:10:01.203 10000 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 878 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/878
node0 6m 46.210s 2025-09-26 03:10:01.204 10001 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 878
node1 6m 46.219s 2025-09-26 03:10:01.213 10062 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 878 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/878
node1 6m 46.220s 2025-09-26 03:10:01.214 10063 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 878
node3 6m 46.248s 2025-09-26 03:10:01.242 10200 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 878 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/878
node3 6m 46.248s 2025-09-26 03:10:01.242 10201 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/44 for round 878
node2 6m 46.264s 2025-09-26 03:10:01.258 10111 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 878
node2 6m 46.267s 2025-09-26 03:10:01.261 10112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 878
Timestamp: 2025-09-26T03:10:00.092694Z
Next consensus number: 26587
Legacy running event hash: 5061e7f69f50460df6ec3c14bc444c88e2d23a3444ac2dcbf91ea4c0e45c49fd87487401920f42bd886ac3252dc5d554
Legacy running event mnemonic: around-access-bid-script
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1742942718
Root hash: 6e134dabd54efdce89c96fbce2905558a7bf7975faba0e16e9bb7dc2fa3ccbbac7ad2d3331fdaaf9f77a8e1c8af0a96d
(root) ConsistencyTestingToolState / second-rely-film-edge
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 more-quit-illegal-control
  1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
  2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
  3 StringLeaf 2824755168719257280 /3 hybrid-oven-release-sing
  4 StringLeaf 878 /4 cup-long-social-science
node2 6m 46.273s 2025-09-26 03:10:01.267 10113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+03+31.391433828Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+07+14.096277183Z_seq1_minr474_maxr5474_orgn0.pces
node2 6m 46.276s 2025-09-26 03:10:01.270 10114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 851
File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+07+14.096277183Z_seq1_minr474_maxr5474_orgn0.pces
node2 6m 46.276s 2025-09-26 03:10:01.270 10115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 6m 46.283s 2025-09-26 03:10:01.277 10116 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 6m 46.284s 2025-09-26 03:10:01.278 10117 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 878 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/878 {"round":878,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/878/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 6m 46.285s 2025-09-26 03:10:01.279 10118 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/197
node0 6m 46.300s 2025-09-26 03:10:01.294 10036 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 878
node0 6m 46.302s 2025-09-26 03:10:01.296 10037 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 878
Timestamp: 2025-09-26T03:10:00.092694Z
Next consensus number: 26587
Legacy running event hash: 5061e7f69f50460df6ec3c14bc444c88e2d23a3444ac2dcbf91ea4c0e45c49fd87487401920f42bd886ac3252dc5d554
Legacy running event mnemonic: around-access-bid-script
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1742942718
Root hash: 6e134dabd54efdce89c96fbce2905558a7bf7975faba0e16e9bb7dc2fa3ccbbac7ad2d3331fdaaf9f77a8e1c8af0a96d
(root) ConsistencyTestingToolState / second-rely-film-edge
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 more-quit-illegal-control
  1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
  2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
  3 StringLeaf 2824755168719257280 /3 hybrid-oven-release-sing
  4 StringLeaf 878 /4 cup-long-social-science
node4 6m 46.303s 2025-09-26 03:10:01.297 1826 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 878 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/878
node4 6m 46.304s 2025-09-26 03:10:01.298 1827 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 878
node0 6m 46.312s 2025-09-26 03:10:01.306 10038 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+07+14.082954667Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+03+31.273927897Z_seq0_minr1_maxr501_orgn0.pces
node0 6m 46.312s 2025-09-26 03:10:01.306 10039 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 851
File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+07+14.082954667Z_seq1_minr474_maxr5474_orgn0.pces
node0 6m 46.312s 2025-09-26 03:10:01.306 10040 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 6m 46.312s 2025-09-26 03:10:01.306 10094 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 878
node1 6m 46.315s 2025-09-26 03:10:01.309 10095 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 878
Timestamp: 2025-09-26T03:10:00.092694Z
Next consensus number: 26587
Legacy running event hash: 5061e7f69f50460df6ec3c14bc444c88e2d23a3444ac2dcbf91ea4c0e45c49fd87487401920f42bd886ac3252dc5d554
Legacy running event mnemonic: around-access-bid-script
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1742942718
Root hash: 6e134dabd54efdce89c96fbce2905558a7bf7975faba0e16e9bb7dc2fa3ccbbac7ad2d3331fdaaf9f77a8e1c8af0a96d
(root) ConsistencyTestingToolState / second-rely-film-edge
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 more-quit-illegal-control
  1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
  2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
  3 StringLeaf 2824755168719257280 /3 hybrid-oven-release-sing
  4 StringLeaf 878 /4 cup-long-social-science
node0 6m 46.320s 2025-09-26 03:10:01.314 10041 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 6m 46.320s 2025-09-26 03:10:01.314 10042 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 878 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/878 {"round":878,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/878/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 6m 46.322s 2025-09-26 03:10:01.316 10043 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/197
node1 6m 46.324s 2025-09-26 03:10:01.318 10096 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+07+14.110613394Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+03+31.280166514Z_seq0_minr1_maxr501_orgn0.pces
node1 6m 46.324s 2025-09-26 03:10:01.318 10097 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 851
File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+07+14.110613394Z_seq1_minr474_maxr5474_orgn0.pces
node1 6m 46.324s 2025-09-26 03:10:01.318 10098 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 6m 46.332s 2025-09-26 03:10:01.326 10099 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 6m 46.332s 2025-09-26 03:10:01.326 10100 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 878 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/878 {"round":878,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/878/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 6m 46.332s 2025-09-26 03:10:01.326 10232 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/44 for round 878
node1 6m 46.333s 2025-09-26 03:10:01.327 10101 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/197
node3 6m 46.335s 2025-09-26 03:10:01.329 10233 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 878
Timestamp: 2025-09-26T03:10:00.092694Z
Next consensus number: 26587
Legacy running event hash: 5061e7f69f50460df6ec3c14bc444c88e2d23a3444ac2dcbf91ea4c0e45c49fd87487401920f42bd886ac3252dc5d554
Legacy running event mnemonic: around-access-bid-script
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1742942718
Root hash: 6e134dabd54efdce89c96fbce2905558a7bf7975faba0e16e9bb7dc2fa3ccbbac7ad2d3331fdaaf9f77a8e1c8af0a96d
(root) ConsistencyTestingToolState / second-rely-film-edge
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 more-quit-illegal-control
  1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
  2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
  3 StringLeaf 2824755168719257280 /3 hybrid-oven-release-sing
  4 StringLeaf 878 /4 cup-long-social-science
node3 6m 46.341s 2025-09-26 03:10:01.335 10234 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+03+31.256200715Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+07+14.042388201Z_seq1_minr474_maxr5474_orgn0.pces
node3 6m 46.341s 2025-09-26 03:10:01.335 10235 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 851
File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+07+14.042388201Z_seq1_minr474_maxr5474_orgn0.pces
node3 6m 46.342s 2025-09-26 03:10:01.336 10236 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 6m 46.349s 2025-09-26 03:10:01.343 10237 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 6m 46.349s 2025-09-26 03:10:01.343 10238 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 878 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/878 {"round":878,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/878/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 6m 46.351s 2025-09-26 03:10:01.345 10239 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/197
node4 6m 46.409s 2025-09-26 03:10:01.403 1865 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 878
node4 6m 46.411s 2025-09-26 03:10:01.405 1866 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 878
Timestamp: 2025-09-26T03:10:00.092694Z
Next consensus number: 26587
Legacy running event hash: 5061e7f69f50460df6ec3c14bc444c88e2d23a3444ac2dcbf91ea4c0e45c49fd87487401920f42bd886ac3252dc5d554
Legacy running event mnemonic: around-access-bid-script
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1742942718
Root hash: 6e134dabd54efdce89c96fbce2905558a7bf7975faba0e16e9bb7dc2fa3ccbbac7ad2d3331fdaaf9f77a8e1c8af0a96d
(root) ConsistencyTestingToolState / second-rely-film-edge
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 more-quit-illegal-control
  1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
  2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
  3 StringLeaf 2824755168719257280 /3 hybrid-oven-release-sing
  4 StringLeaf 878 /4 cup-long-social-science
node4 6m 46.420s 2025-09-26 03:10:01.414 1867 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T03+03+31.142041701Z_seq0_minr1_maxr391_orgn0.pces
Last file: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T03+09+15.883842205Z_seq1_minr751_maxr1251_orgn778.pces
node4 6m 46.421s 2025-09-26 03:10:01.415 1868 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 851
File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T03+09+15.883842205Z_seq1_minr751_maxr1251_orgn778.pces
node4 6m 46.421s 2025-09-26 03:10:01.415 1869 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 6m 46.426s 2025-09-26 03:10:01.420 1870 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 6m 46.427s 2025-09-26 03:10:01.421 1871 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 878 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/878 {"round":878,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/878/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 6m 46.429s 2025-09-26 03:10:01.423 1872 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1
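The .pces file names in the entries above appear to encode a sequence number, a round range, and an origin round (node4's 03:09 PcesFileManager entry announces that future files will carry origin round 778, and its newest file indeed ends in _orgn778). Reading minr/maxr as the minimum/maximum round covered is an assumption; the sketch below only restates the naming pattern visible in these lines.

import re

PCES_NAME_RE = re.compile(r"_seq(\d+)_minr(\d+)_maxr(\d+)_orgn(\d+)\.pces$")

def pces_fields(file_name: str) -> dict:
    # Split a PCES file name into its numeric components (assumed meanings noted above).
    seq, min_round, max_round, origin = map(int, PCES_NAME_RE.search(file_name).groups())
    return {"seq": seq, "min_round": min_round, "max_round": max_round, "origin_round": origin}

fields = pces_fields("2025-09-26T03+09+15.883842205Z_seq1_minr751_maxr1251_orgn778.pces")
# Consistent with BestEffortPcesFileCopy selecting exactly this file for a lower bound of 851:
assert fields["min_round"] <= 851 <= fields["max_round"]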
node2 7m 46.204s 2025-09-26 03:11:01.198 11558 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1007 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 7m 46.208s 2025-09-26 03:11:01.202 11517 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1007 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 7m 46.210s 2025-09-26 03:11:01.204 11671 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1007 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 7m 46.222s 2025-09-26 03:11:01.216 11463 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1007 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 7m 46.243s 2025-09-26 03:11:01.237 3304 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1007 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 7m 46.386s 2025-09-26 03:11:01.380 11674 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1007 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1007
node3 7m 46.386s 2025-09-26 03:11:01.380 11675 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for round 1007
node2 7m 46.473s 2025-09-26 03:11:01.467 11561 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1007 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1007
node2 7m 46.474s 2025-09-26 03:11:01.468 11562 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/49 for round 1007
node3 7m 46.478s 2025-09-26 03:11:01.472 11710 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for round 1007
node3 7m 46.480s 2025-09-26 03:11:01.474 11711 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1007
Timestamp: 2025-09-26T03:11:00.298794Z
Next consensus number: 31429
Legacy running event hash: 399e5a361e5301a050e7d59b04c77e5c799bba09c774692f497fd9ddf5d4a71bd4bfecd6d42ce8224d271973b8f408e7
Legacy running event mnemonic: critic-egg-skill-cross
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1777388498
Root hash: e5aa263a0aa413f392f03b4b3a5ba6204965266d0f1aa0d81c706615c6c289127a1ccbcdef6c79cedacdcf7afa01bf20
(root) ConsistencyTestingToolState / enlist-shrug-fee-lazy
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 indicate-gun-supreme-digital
  1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
  2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
  3 StringLeaf -7896553829280765051 /3 rate-train-size-orchard
  4 StringLeaf 1007 /4 cliff-art-meadow-hold
node3 7m 46.487s 2025-09-26 03:11:01.481 11712 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+03+31.256200715Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+07+14.042388201Z_seq1_minr474_maxr5474_orgn0.pces
node3 7m 46.487s 2025-09-26 03:11:01.481 11713 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 980
File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T03+07+14.042388201Z_seq1_minr474_maxr5474_orgn0.pces
node3 7m 46.488s 2025-09-26 03:11:01.482 11714 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 7m 46.498s 2025-09-26 03:11:01.492 11715 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 7m 46.498s 2025-09-26 03:11:01.492 11716 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1007 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1007 {"round":1007,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1007/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 7m 46.500s 2025-09-26 03:11:01.494 11717 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/334
node0 7m 46.551s 2025-09-26 03:11:01.545 11476 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1007 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1007
node1 7m 46.551s 2025-09-26 03:11:01.545 11520 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1007 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1007
node0 7m 46.552s 2025-09-26 03:11:01.546 11477 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/49 for round 1007
node1 7m 46.552s 2025-09-26 03:11:01.546 11521 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/49 for round 1007
node2 7m 46.561s 2025-09-26 03:11:01.555 11593 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/49 for round 1007
node2 7m 46.563s 2025-09-26 03:11:01.557 11594 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1007
Timestamp: 2025-09-26T03:11:00.298794Z
Next consensus number: 31429
Legacy running event hash: 399e5a361e5301a050e7d59b04c77e5c799bba09c774692f497fd9ddf5d4a71bd4bfecd6d42ce8224d271973b8f408e7
Legacy running event mnemonic: critic-egg-skill-cross
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1777388498
Root hash: e5aa263a0aa413f392f03b4b3a5ba6204965266d0f1aa0d81c706615c6c289127a1ccbcdef6c79cedacdcf7afa01bf20
(root) ConsistencyTestingToolState / enlist-shrug-fee-lazy
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 indicate-gun-supreme-digital
  1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
  2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
  3 StringLeaf -7896553829280765051 /3 rate-train-size-orchard
  4 StringLeaf 1007 /4 cliff-art-meadow-hold
node2 7m 46.571s 2025-09-26 03:11:01.565 11595 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+03+31.391433828Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+07+14.096277183Z_seq1_minr474_maxr5474_orgn0.pces
node2 7m 46.572s 2025-09-26 03:11:01.566 11596 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 980
File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T03+07+14.096277183Z_seq1_minr474_maxr5474_orgn0.pces
node2 7m 46.572s 2025-09-26 03:11:01.566 11597 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 7m 46.583s 2025-09-26 03:11:01.577 11598 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 7m 46.583s 2025-09-26 03:11:01.577 11599 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1007 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1007 {"round":1007,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1007/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 7m 46.585s 2025-09-26 03:11:01.579 11600 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/334
node4 7m 46.607s 2025-09-26 03:11:01.601 3317 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1007 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1007
node4 7m 46.608s 2025-09-26 03:11:01.602 3318 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/14 for round 1007
node1 7m 46.642s 2025-09-26 03:11:01.636 11564 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/49 for round 1007
node1 7m 46.644s 2025-09-26 03:11:01.638 11565 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1007
Timestamp: 2025-09-26T03:11:00.298794Z
Next consensus number: 31429
Legacy running event hash: 399e5a361e5301a050e7d59b04c77e5c799bba09c774692f497fd9ddf5d4a71bd4bfecd6d42ce8224d271973b8f408e7
Legacy running event mnemonic: critic-egg-skill-cross
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1777388498
Root hash: e5aa263a0aa413f392f03b4b3a5ba6204965266d0f1aa0d81c706615c6c289127a1ccbcdef6c79cedacdcf7afa01bf20
(root) ConsistencyTestingToolState / enlist-shrug-fee-lazy
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 indicate-gun-supreme-digital
  1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
  2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
  3 StringLeaf -7896553829280765051 /3 rate-train-size-orchard
  4 StringLeaf 1007 /4 cliff-art-meadow-hold
node0 7m 46.645s 2025-09-26 03:11:01.639 11516 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/49 for round 1007
node0 7m 46.647s 2025-09-26 03:11:01.641 11517 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1007
Timestamp: 2025-09-26T03:11:00.298794Z
Next consensus number: 31429
Legacy running event hash: 399e5a361e5301a050e7d59b04c77e5c799bba09c774692f497fd9ddf5d4a71bd4bfecd6d42ce8224d271973b8f408e7
Legacy running event mnemonic: critic-egg-skill-cross
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1777388498
Root hash: e5aa263a0aa413f392f03b4b3a5ba6204965266d0f1aa0d81c706615c6c289127a1ccbcdef6c79cedacdcf7afa01bf20
(root) ConsistencyTestingToolState / enlist-shrug-fee-lazy
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 indicate-gun-supreme-digital
  1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
  2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
  3 StringLeaf -7896553829280765051 /3 rate-train-size-orchard
  4 StringLeaf 1007 /4 cliff-art-meadow-hold
node1 7m 46.651s 2025-09-26 03:11:01.645 11566 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+07+14.110613394Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+03+31.280166514Z_seq0_minr1_maxr501_orgn0.pces
node1 7m 46.652s 2025-09-26 03:11:01.646 11567 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 980
File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T03+07+14.110613394Z_seq1_minr474_maxr5474_orgn0.pces
node1 7m 46.652s 2025-09-26 03:11:01.646 11568 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 7m 46.654s 2025-09-26 03:11:01.648 11518 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+07+14.082954667Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+03+31.273927897Z_seq0_minr1_maxr501_orgn0.pces
node0 7m 46.654s 2025-09-26 03:11:01.648 11519 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 980
File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T03+07+14.082954667Z_seq1_minr474_maxr5474_orgn0.pces
node0 7m 46.655s 2025-09-26 03:11:01.649 11520 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 7m 46.663s 2025-09-26 03:11:01.657 11569 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 7m 46.663s 2025-09-26 03:11:01.657 11570 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1007 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1007 {"round":1007,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1007/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 7m 46.665s 2025-09-26 03:11:01.659 11521 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 7m 46.665s 2025-09-26 03:11:01.659 11571 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/334
node0 7m 46.666s 2025-09-26 03:11:01.660 11522 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1007 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1007 {"round":1007,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1007/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 7m 46.667s 2025-09-26 03:11:01.661 11523 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/334
node4 7m 46.709s 2025-09-26 03:11:01.703 3360 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/14 for round 1007
node4 7m 46.711s 2025-09-26 03:11:01.705 3361 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1007
Timestamp: 2025-09-26T03:11:00.298794Z
Next consensus number: 31429
Legacy running event hash: 399e5a361e5301a050e7d59b04c77e5c799bba09c774692f497fd9ddf5d4a71bd4bfecd6d42ce8224d271973b8f408e7
Legacy running event mnemonic: critic-egg-skill-cross
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1777388498
Root hash: e5aa263a0aa413f392f03b4b3a5ba6204965266d0f1aa0d81c706615c6c289127a1ccbcdef6c79cedacdcf7afa01bf20
(root) ConsistencyTestingToolState / enlist-shrug-fee-lazy
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 indicate-gun-supreme-digital
  1 SingletonNode RosterService.ROSTER_STATE /1 dilemma-inhale-stock-retire
  2 VirtualMap RosterService.ROSTERS /2 isolate-clerk-worry-scrub
  3 StringLeaf -7896553829280765051 /3 rate-train-size-orchard
  4 StringLeaf 1007 /4 cliff-art-meadow-hold
node4 7m 46.719s 2025-09-26 03:11:01.713 3362 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T03+03+31.142041701Z_seq0_minr1_maxr391_orgn0.pces
Last file: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T03+09+15.883842205Z_seq1_minr751_maxr1251_orgn778.pces
node4 7m 46.719s 2025-09-26 03:11:01.713 3363 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 980
File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T03+09+15.883842205Z_seq1_minr751_maxr1251_orgn778.pces
node4 7m 46.719s 2025-09-26 03:11:01.713 3364 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 7m 46.725s 2025-09-26 03:11:01.719 3365 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 7m 46.726s 2025-09-26 03:11:01.720 3366 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1007 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1007 {"round":1007,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1007/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 7m 46.727s 2025-09-26 03:11:01.721 3367 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/62