node4 0.000ns 2025-12-04 02:37:38.681 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node4 85.000ms 2025-12-04 02:37:38.766 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node4 100.000ms 2025-12-04 02:37:38.781 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node1 177.000ms 2025-12-04 02:37:38.858 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node4 207.000ms 2025-12-04 02:37:38.888 4 INFO STARTUP <main> Browser: The following nodes [4] are set to run locally
node4 213.000ms 2025-12-04 02:37:38.894 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node4 225.000ms 2025-12-04 02:37:38.906 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node1 271.000ms 2025-12-04 02:37:38.952 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node1 286.000ms 2025-12-04 02:37:38.967 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node1 419.000ms 2025-12-04 02:37:39.100 4 INFO STARTUP <main> Browser: The following nodes [1] are set to run locally
node1 425.000ms 2025-12-04 02:37:39.106 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node1 437.000ms 2025-12-04 02:37:39.118 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 632.000ms 2025-12-04 02:37:39.313 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node4 633.000ms 2025-12-04 02:37:39.314 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node1 855.000ms 2025-12-04 02:37:39.536 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node1 856.000ms 2025-12-04 02:37:39.537 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node2 1.456s 2025-12-04 02:37:40.137 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node4 1.486s 2025-12-04 02:37:40.167 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 853ms
node4 1.497s 2025-12-04 02:37:40.178 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node4 1.501s 2025-12-04 02:37:40.182 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 1.542s 2025-12-04 02:37:40.223 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node2 1.551s 2025-12-04 02:37:40.232 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node2 1.567s 2025-12-04 02:37:40.248 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 1.609s 2025-12-04 02:37:40.290 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node4 1.613s 2025-12-04 02:37:40.294 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node1 1.632s 2025-12-04 02:37:40.313 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 775ms
node1 1.642s 2025-12-04 02:37:40.323 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node1 1.645s 2025-12-04 02:37:40.326 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node2 1.677s 2025-12-04 02:37:40.358 4 INFO STARTUP <main> Browser: The following nodes [2] are set to run locally
node1 1.681s 2025-12-04 02:37:40.362 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node2 1.683s 2025-12-04 02:37:40.364 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node2 1.696s 2025-12-04 02:37:40.377 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node1 1.740s 2025-12-04 02:37:40.421 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node1 1.740s 2025-12-04 02:37:40.421 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node0 2.012s 2025-12-04 02:37:40.693 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node0 2.106s 2025-12-04 02:37:40.787 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node0 2.123s 2025-12-04 02:37:40.804 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node2 2.130s 2025-12-04 02:37:40.811 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node2 2.131s 2025-12-04 02:37:40.812 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node0 2.238s 2025-12-04 02:37:40.919 4 INFO STARTUP <main> Browser: The following nodes [0] are set to run locally
node0 2.244s 2025-12-04 02:37:40.925 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node0 2.256s 2025-12-04 02:37:40.937 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node3 2.438s 2025-12-04 02:37:41.119 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node3 2.546s 2025-12-04 02:37:41.227 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node3 2.564s 2025-12-04 02:37:41.245 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node3 2.684s 2025-12-04 02:37:41.365 4 INFO STARTUP <main> Browser: The following nodes [3] are set to run locally
node3 2.690s 2025-12-04 02:37:41.371 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node0 2.693s 2025-12-04 02:37:41.374 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node0 2.694s 2025-12-04 02:37:41.375 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node3 2.704s 2025-12-04 02:37:41.385 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node3 3.156s 2025-12-04 02:37:41.837 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node3 3.157s 2025-12-04 02:37:41.838 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node2 3.226s 2025-12-04 02:37:41.907 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1094ms
node2 3.235s 2025-12-04 02:37:41.916 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node2 3.238s 2025-12-04 02:37:41.919 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node2 3.277s 2025-12-04 02:37:41.958 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node2 3.336s 2025-12-04 02:37:42.017 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node2 3.337s 2025-12-04 02:37:42.018 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node4 3.641s 2025-12-04 02:37:42.322 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node4 3.731s 2025-12-04 02:37:42.412 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 3.733s 2025-12-04 02:37:42.414 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node4 3.734s 2025-12-04 02:37:42.415 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node0 3.765s 2025-12-04 02:37:42.446 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1070ms
node1 3.767s 2025-12-04 02:37:42.448 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node0 3.774s 2025-12-04 02:37:42.455 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node0 3.778s 2025-12-04 02:37:42.459 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node0 3.821s 2025-12-04 02:37:42.502 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node1 3.854s 2025-12-04 02:37:42.535 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 3.856s 2025-12-04 02:37:42.537 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node1 3.857s 2025-12-04 02:37:42.538 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node0 3.885s 2025-12-04 02:37:42.566 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node0 3.886s 2025-12-04 02:37:42.567 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node3 4.019s 2025-12-04 02:37:42.700 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 861ms
node3 4.028s 2025-12-04 02:37:42.709 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node3 4.031s 2025-12-04 02:37:42.712 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node3 4.073s 2025-12-04 02:37:42.754 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node3 4.145s 2025-12-04 02:37:42.826 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node3 4.146s 2025-12-04 02:37:42.827 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node4 4.509s 2025-12-04 02:37:43.190 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 4.520s 2025-12-04 02:37:43.201 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node4 4.526s 2025-12-04 02:37:43.207 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node4 4.537s 2025-12-04 02:37:43.218 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 4.539s 2025-12-04 02:37:43.220 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 4.640s 2025-12-04 02:37:43.321 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 4.651s 2025-12-04 02:37:43.332 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node1 4.658s 2025-12-04 02:37:43.339 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node1 4.670s 2025-12-04 02:37:43.351 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 4.672s 2025-12-04 02:37:43.353 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 5.343s 2025-12-04 02:37:44.024 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node2 5.430s 2025-12-04 02:37:44.111 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 5.432s 2025-12-04 02:37:44.113 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node2 5.433s 2025-12-04 02:37:44.114 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 5.661s 2025-12-04 02:37:44.342 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26363054]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=316621, randomLong=-3926998998078570757, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=10210, randomLong=-1609148538921788928, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1103460, data=35, exception=null]
OS Health Check Report - Complete (took 1020 ms)
node4 5.692s 2025-12-04 02:37:44.373 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node4 5.699s 2025-12-04 02:37:44.380 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node4 5.701s 2025-12-04 02:37:44.382 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node4 5.779s 2025-12-04 02:37:44.460 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "Ioh9hA==", "port": 30124 }, { "ipAddressV4": "CoAAPQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "IqtEug==", "port": 30125 }, { "ipAddressV4": "CoAAPg==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "iHF6nQ==", "port": 30126 }, { "ipAddressV4": "CoAAQA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "gtN3xw==", "port": 30127 }, { "ipAddressV4": "CoAAQQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "aMWy1A==", "port": 30128 }, { "ipAddressV4": "CoAAPw==", "port": 30128 }] }] }
node1 5.780s 2025-12-04 02:37:44.461 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26318638]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=188290, randomLong=-7661110119074911087, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=10220, randomLong=8252292378353464539, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1133710, data=35, exception=null]
OS Health Check Report - Complete (took 1021 ms)
node4 5.799s 2025-12-04 02:37:44.480 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 5.799s 2025-12-04 02:37:44.480 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node1 5.810s 2025-12-04 02:37:44.491 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node4 5.813s 2025-12-04 02:37:44.494 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: b4562bb922f0edb0efd08d79c98af8bfb9d7c4fed7fbe29938029f74612b25fdcacc3002a55922c1da3253b65ce23040
(root) ConsistencyTestingToolState / inspire-jaguar-fun-energy
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate
    1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon
    2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy
node1 5.817s 2025-12-04 02:37:44.498 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node1 5.820s 2025-12-04 02:37:44.501 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node0 5.875s 2025-12-04 02:37:44.556 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node1 5.899s 2025-12-04 02:37:44.580 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "Ioh9hA==", "port": 30124 }, { "ipAddressV4": "CoAAPQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "IqtEug==", "port": 30125 }, { "ipAddressV4": "CoAAPg==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "iHF6nQ==", "port": 30126 }, { "ipAddressV4": "CoAAQA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "gtN3xw==", "port": 30127 }, { "ipAddressV4": "CoAAQQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "aMWy1A==", "port": 30128 }, { "ipAddressV4": "CoAAPw==", "port": 30128 }] }] }
node1 5.920s 2025-12-04 02:37:44.601 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/1/ConsistencyTestLog.csv
node1 5.920s 2025-12-04 02:37:44.601 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node1 5.934s 2025-12-04 02:37:44.615 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: b4562bb922f0edb0efd08d79c98af8bfb9d7c4fed7fbe29938029f74612b25fdcacc3002a55922c1da3253b65ce23040
(root) ConsistencyTestingToolState / inspire-jaguar-fun-energy
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate
    1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon
    2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy
node0 5.967s 2025-12-04 02:37:44.648 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 5.969s 2025-12-04 02:37:44.650 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node0 5.970s 2025-12-04 02:37:44.651 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 5.998s 2025-12-04 02:37:44.679 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node4 6.002s 2025-12-04 02:37:44.683 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node4 6.007s 2025-12-04 02:37:44.688 47 INFO STARTUP <<start-node-4>> ConsistencyTestingToolMain: init called in Main for node 4.
node4 6.008s 2025-12-04 02:37:44.689 48 INFO STARTUP <<start-node-4>> SwirldsPlatform: Starting platform 4
node4 6.009s 2025-12-04 02:37:44.690 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node4 6.012s 2025-12-04 02:37:44.693 50 INFO STARTUP <<start-node-4>> CycleFinder: No cyclical back pressure detected in wiring model.
node4 6.013s 2025-12-04 02:37:44.694 51 INFO STARTUP <<start-node-4>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node4 6.014s 2025-12-04 02:37:44.695 52 INFO STARTUP <<start-node-4>> InputWireChecks: All input wires have been bound.
node4 6.015s 2025-12-04 02:37:44.696 53 WARN STARTUP <<start-node-4>> PcesFileTracker: No preconsensus event files available
node4 6.016s 2025-12-04 02:37:44.697 54 INFO STARTUP <<start-node-4>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node4 6.017s 2025-12-04 02:37:44.698 55 INFO STARTUP <<start-node-4>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node4 6.019s 2025-12-04 02:37:44.700 56 INFO STARTUP <<app: appMain 4>> ConsistencyTestingToolMain: run called in Main.
node4 6.020s 2025-12-04 02:37:44.701 57 INFO PLATFORM_STATUS <platformForkJoinThread-5> StatusStateMachine: Platform spent 153.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node4 6.025s 2025-12-04 02:37:44.706 58 INFO PLATFORM_STATUS <platformForkJoinThread-5> StatusStateMachine: Platform spent 3.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node1 6.149s 2025-12-04 02:37:44.830 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node1 6.154s 2025-12-04 02:37:44.835 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node1 6.159s 2025-12-04 02:37:44.840 47 INFO STARTUP <<start-node-1>> ConsistencyTestingToolMain: init called in Main for node 1.
node1 6.159s 2025-12-04 02:37:44.840 48 INFO STARTUP <<start-node-1>> SwirldsPlatform: Starting platform 1
node1 6.161s 2025-12-04 02:37:44.842 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node1 6.165s 2025-12-04 02:37:44.846 50 INFO STARTUP <<start-node-1>> CycleFinder: No cyclical back pressure detected in wiring model.
node1 6.166s 2025-12-04 02:37:44.847 51 INFO STARTUP <<start-node-1>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node1 6.166s 2025-12-04 02:37:44.847 52 INFO STARTUP <<start-node-1>> InputWireChecks: All input wires have been bound.
node1 6.168s 2025-12-04 02:37:44.849 53 WARN STARTUP <<start-node-1>> PcesFileTracker: No preconsensus event files available
node1 6.168s 2025-12-04 02:37:44.849 54 INFO STARTUP <<start-node-1>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node1 6.170s 2025-12-04 02:37:44.851 55 INFO STARTUP <<start-node-1>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node1 6.172s 2025-12-04 02:37:44.853 56 INFO STARTUP <<app: appMain 1>> ConsistencyTestingToolMain: run called in Main.
node1 6.173s 2025-12-04 02:37:44.854 57 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 185.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node1 6.179s 2025-12-04 02:37:44.860 58 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node3 6.207s 2025-12-04 02:37:44.888 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node2 6.220s 2025-12-04 02:37:44.901 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 6.232s 2025-12-04 02:37:44.913 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node2 6.240s 2025-12-04 02:37:44.921 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node2 6.251s 2025-12-04 02:37:44.932 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 6.253s 2025-12-04 02:37:44.934 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 6.289s 2025-12-04 02:37:44.970 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 6.291s 2025-12-04 02:37:44.972 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node3 6.292s 2025-12-04 02:37:44.973 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node0 6.863s 2025-12-04 02:37:45.544 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 6.877s 2025-12-04 02:37:45.558 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node0 6.885s 2025-12-04 02:37:45.566 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node0 6.900s 2025-12-04 02:37:45.581 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 6.902s 2025-12-04 02:37:45.583 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 7.080s 2025-12-04 02:37:45.761 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 7.091s 2025-12-04 02:37:45.772 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node3 7.097s 2025-12-04 02:37:45.778 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node3 7.108s 2025-12-04 02:37:45.789 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 7.110s 2025-12-04 02:37:45.791 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 7.367s 2025-12-04 02:37:46.048 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26403754]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=177400, randomLong=6678118395134873905, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=11580, randomLong=418401203164071879, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1147350, data=35, exception=null]
OS Health Check Report - Complete (took 1022 ms)
node2 7.398s 2025-12-04 02:37:46.079 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node2 7.405s 2025-12-04 02:37:46.086 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node2 7.408s 2025-12-04 02:37:46.089 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node2 7.485s 2025-12-04 02:37:46.166 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "Ioh9hA==", "port": 30124 }, { "ipAddressV4": "CoAAPQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "IqtEug==", "port": 30125 }, { "ipAddressV4": "CoAAPg==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "iHF6nQ==", "port": 30126 }, { "ipAddressV4": "CoAAQA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "gtN3xw==", "port": 30127 }, { "ipAddressV4": "CoAAQQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "aMWy1A==", "port": 30128 }, { "ipAddressV4": "CoAAPw==", "port": 30128 }] }] }
node2 7.506s 2025-12-04 02:37:46.187 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/2/ConsistencyTestLog.csv
node2 7.507s 2025-12-04 02:37:46.188 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node2 7.521s 2025-12-04 02:37:46.202 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: b4562bb922f0edb0efd08d79c98af8bfb9d7c4fed7fbe29938029f74612b25fdcacc3002a55922c1da3253b65ce23040
(root) ConsistencyTestingToolState / inspire-jaguar-fun-energy
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate
    1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon
    2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy
node2 7.718s 2025-12-04 02:37:46.399 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node2 7.722s 2025-12-04 02:37:46.403 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node2 7.727s 2025-12-04 02:37:46.408 47 INFO STARTUP <<start-node-2>> ConsistencyTestingToolMain: init called in Main for node 2.
node2 7.727s 2025-12-04 02:37:46.408 48 INFO STARTUP <<start-node-2>> SwirldsPlatform: Starting platform 2
node2 7.728s 2025-12-04 02:37:46.409 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node2 7.731s 2025-12-04 02:37:46.412 50 INFO STARTUP <<start-node-2>> CycleFinder: No cyclical back pressure detected in wiring model.
node2 7.733s 2025-12-04 02:37:46.414 51 INFO STARTUP <<start-node-2>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node2 7.733s 2025-12-04 02:37:46.414 52 INFO STARTUP <<start-node-2>> InputWireChecks: All input wires have been bound.
node2 7.735s 2025-12-04 02:37:46.416 53 WARN STARTUP <<start-node-2>> PcesFileTracker: No preconsensus event files available
node2 7.735s 2025-12-04 02:37:46.416 54 INFO STARTUP <<start-node-2>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node2 7.737s 2025-12-04 02:37:46.418 55 INFO STARTUP <<start-node-2>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node2 7.739s 2025-12-04 02:37:46.420 56 INFO STARTUP <<app: appMain 2>> ConsistencyTestingToolMain: run called in Main.
node2 7.741s 2025-12-04 02:37:46.422 57 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 164.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node2 7.746s 2025-12-04 02:37:46.427 58 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node0 8.048s 2025-12-04 02:37:46.729 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26165399]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=167489, randomLong=1676843745513264743, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=16850, randomLong=-1339232021235618129, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1281590, data=35, exception=null]
OS Health Check Report - Complete (took 1025 ms)
node0 8.083s 2025-12-04 02:37:46.764 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node0 8.091s 2025-12-04 02:37:46.772 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node0 8.095s 2025-12-04 02:37:46.776 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node0 8.185s 2025-12-04 02:37:46.866 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "Ioh9hA==", "port": 30124 }, { "ipAddressV4": "CoAAPQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "IqtEug==", "port": 30125 }, { "ipAddressV4": "CoAAPg==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "iHF6nQ==", "port": 30126 }, { "ipAddressV4": "CoAAQA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "gtN3xw==", "port": 30127 }, { "ipAddressV4": "CoAAQQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "aMWy1A==", "port": 30128 }, { "ipAddressV4": "CoAAPw==", "port": 30128 }] }] }
node0 8.208s 2025-12-04 02:37:46.889 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/0/ConsistencyTestLog.csv
node0 8.209s 2025-12-04 02:37:46.890 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node3 8.213s 2025-12-04 02:37:46.894 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26261770]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=338560, randomLong=2739597892491505382, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=34200, randomLong=-8371909204353679164, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1348740, data=35, exception=null]
OS Health Check Report - Complete (took 1025 ms)
node0 8.226s 2025-12-04 02:37:46.907 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: b4562bb922f0edb0efd08d79c98af8bfb9d7c4fed7fbe29938029f74612b25fdcacc3002a55922c1da3253b65ce23040
(root) ConsistencyTestingToolState / inspire-jaguar-fun-energy
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate
    1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon
    2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy
node3 8.249s 2025-12-04 02:37:46.930 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node3 8.258s 2025-12-04 02:37:46.939 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node3 8.261s 2025-12-04 02:37:46.942 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node3 8.363s 2025-12-04 02:37:47.044 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "Ioh9hA==", "port": 30124 }, { "ipAddressV4": "CoAAPQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "IqtEug==", "port": 30125 }, { "ipAddressV4": "CoAAPg==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "iHF6nQ==", "port": 30126 }, { "ipAddressV4": "CoAAQA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "gtN3xw==", "port": 30127 }, { "ipAddressV4": "CoAAQQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "aMWy1A==", "port": 30128 }, { "ipAddressV4": "CoAAPw==", "port": 30128 }] }] }
node3 8.387s 2025-12-04 02:37:47.068 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/3/ConsistencyTestLog.csv
node3 8.387s 2025-12-04 02:37:47.068 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node3 8.404s 2025-12-04 02:37:47.085 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: b4562bb922f0edb0efd08d79c98af8bfb9d7c4fed7fbe29938029f74612b25fdcacc3002a55922c1da3253b65ce23040 (root) ConsistencyTestingToolState / inspire-jaguar-fun-energy 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate 1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon 2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy
node0 8.457s 2025-12-04 02:37:47.138 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node0 8.463s 2025-12-04 02:37:47.144 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node0 8.468s 2025-12-04 02:37:47.149 47 INFO STARTUP <<start-node-0>> ConsistencyTestingToolMain: init called in Main for node 0.
node0 8.469s 2025-12-04 02:37:47.150 48 INFO STARTUP <<start-node-0>> SwirldsPlatform: Starting platform 0
node0 8.470s 2025-12-04 02:37:47.151 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node0 8.475s 2025-12-04 02:37:47.156 50 INFO STARTUP <<start-node-0>> CycleFinder: No cyclical back pressure detected in wiring model.
node0 8.476s 2025-12-04 02:37:47.157 51 INFO STARTUP <<start-node-0>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node0 8.477s 2025-12-04 02:37:47.158 52 INFO STARTUP <<start-node-0>> InputWireChecks: All input wires have been bound.
node0 8.478s 2025-12-04 02:37:47.159 53 WARN STARTUP <<start-node-0>> PcesFileTracker: No preconsensus event files available
node0 8.479s 2025-12-04 02:37:47.160 54 INFO STARTUP <<start-node-0>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node0 8.481s 2025-12-04 02:37:47.162 55 INFO STARTUP <<start-node-0>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node0 8.482s 2025-12-04 02:37:47.163 56 INFO STARTUP <<app: appMain 0>> ConsistencyTestingToolMain: run called in Main.
node0 8.485s 2025-12-04 02:37:47.166 57 INFO PLATFORM_STATUS <platformForkJoinThread-5> StatusStateMachine: Platform spent 198.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node0 8.491s 2025-12-04 02:37:47.172 58 INFO PLATFORM_STATUS <platformForkJoinThread-5> StatusStateMachine: Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node3 8.632s 2025-12-04 02:37:47.313 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node3 8.639s 2025-12-04 02:37:47.320 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node3 8.645s 2025-12-04 02:37:47.326 47 INFO STARTUP <<start-node-3>> ConsistencyTestingToolMain: init called in Main for node 3.
node3 8.646s 2025-12-04 02:37:47.327 48 INFO STARTUP <<start-node-3>> SwirldsPlatform: Starting platform 3
node3 8.647s 2025-12-04 02:37:47.328 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node3 8.651s 2025-12-04 02:37:47.332 50 INFO STARTUP <<start-node-3>> CycleFinder: No cyclical back pressure detected in wiring model.
node3 8.652s 2025-12-04 02:37:47.333 51 INFO STARTUP <<start-node-3>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node3 8.653s 2025-12-04 02:37:47.334 52 INFO STARTUP <<start-node-3>> InputWireChecks: All input wires have been bound.
node3 8.654s 2025-12-04 02:37:47.335 53 WARN STARTUP <<start-node-3>> PcesFileTracker: No preconsensus event files available
node3 8.655s 2025-12-04 02:37:47.336 54 INFO STARTUP <<start-node-3>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node3 8.657s 2025-12-04 02:37:47.338 55 INFO STARTUP <<start-node-3>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node3 8.658s 2025-12-04 02:37:47.339 56 INFO STARTUP <<app: appMain 3>> ConsistencyTestingToolMain: run called in Main.
node3 8.661s 2025-12-04 02:37:47.342 57 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 188.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node3 8.666s 2025-12-04 02:37:47.347 58 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node4 9.021s 2025-12-04 02:37:47.702 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting4.csv' ]
node4 9.024s 2025-12-04 02:37:47.705 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node1 9.171s 2025-12-04 02:37:47.852 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting1.csv' ]
node1 9.174s 2025-12-04 02:37:47.855 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node2 10.737s 2025-12-04 02:37:49.418 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting2.csv' ]
node2 10.740s 2025-12-04 02:37:49.421 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node0 11.481s 2025-12-04 02:37:50.162 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting0.csv' ]
node0 11.483s 2025-12-04 02:37:50.164 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node3 11.659s 2025-12-04 02:37:50.340 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting3.csv' ]
node3 11.662s 2025-12-04 02:37:50.343 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node4 16.115s 2025-12-04 02:37:54.796 61 INFO PLATFORM_STATUS <platformForkJoinThread-3> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node1 16.268s 2025-12-04 02:37:54.949 61 INFO PLATFORM_STATUS <platformForkJoinThread-3> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node2 17.834s 2025-12-04 02:37:56.515 61 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node0 18.578s 2025-12-04 02:37:57.259 61 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node3 18.754s 2025-12-04 02:37:57.435 61 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node2 19.904s 2025-12-04 02:37:58.585 63 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node0 19.924s 2025-12-04 02:37:58.605 63 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node3 20.005s 2025-12-04 02:37:58.686 63 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node1 20.046s 2025-12-04 02:37:58.727 63 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node4 20.120s 2025-12-04 02:37:58.801 62 INFO PLATFORM_STATUS <platformForkJoinThread-7> StatusStateMachine: Platform spent 4.0 s in CHECKING. Now in ACTIVE
node4 20.134s 2025-12-04 02:37:58.815 64 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node1 20.195s 2025-12-04 02:37:58.876 78 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1
node1 20.197s 2025-12-04 02:37:58.878 79 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node2 20.294s 2025-12-04 02:37:58.975 78 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1
node2 20.296s 2025-12-04 02:37:58.977 79 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node0 20.334s 2025-12-04 02:37:59.015 78 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1
node0 20.336s 2025-12-04 02:37:59.017 79 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node3 20.336s 2025-12-04 02:37:59.017 78 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1
node3 20.338s 2025-12-04 02:37:59.019 79 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node3 20.430s 2025-12-04 02:37:59.111 96 INFO PLATFORM_STATUS <platformForkJoinThread-8> StatusStateMachine: Platform spent 1.7 s in CHECKING. Now in ACTIVE
node4 20.446s 2025-12-04 02:37:59.127 79 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1
node4 20.447s 2025-12-04 02:37:59.128 80 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node1 20.448s 2025-12-04 02:37:59.129 110 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 4.2 s in CHECKING. Now in ACTIVE
node1 20.449s 2025-12-04 02:37:59.130 111 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node1 20.453s 2025-12-04 02:37:59.134 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1 Timestamp: 2025-12-04T02:37:56.525093565Z Next consensus number: 1 Legacy running event hash: c44263df8e2fdb3325fbd07fb14ab5d8087bfd7039563348b2b69f7ae9fa35e102480df0889a00a9cdcbf7429404340a Legacy running event mnemonic: jelly-know-demand-often Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1450302654 Root hash: a32f64e3e66478d88f70ee81b0c28a898a2070d8bd48f2d5e522a8c77b83237f23e5d671fc0ea4530b51e121ca268728 (root) ConsistencyTestingToolState / olive-organ-crime-brand 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 poverty-gadget-sheriff-claw 1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon 2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy 3 StringLeaf 1931016930446315563 /3 almost-atom-novel-view 4 StringLeaf 1 /4 wreck-whale-old-bottom
node1 20.485s 2025-12-04 02:37:59.166 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/12/04/2025-12-04T02+37+54.916827754Z_seq0_minr1_maxr501_orgn0.pces
node1 20.485s 2025-12-04 02:37:59.166 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/1/2025/12/04/2025-12-04T02+37+54.916827754Z_seq0_minr1_maxr501_orgn0.pces
node1 20.486s 2025-12-04 02:37:59.167 116 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 20.487s 2025-12-04 02:37:59.168 117 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 20.491s 2025-12-04 02:37:59.172 118 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 20.538s 2025-12-04 02:37:59.219 97 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 2.0 s in CHECKING. Now in ACTIVE
node2 20.542s 2025-12-04 02:37:59.223 109 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 2.7 s in CHECKING. Now in ACTIVE
node2 20.543s 2025-12-04 02:37:59.224 110 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node2 20.546s 2025-12-04 02:37:59.227 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1 Timestamp: 2025-12-04T02:37:56.525093565Z Next consensus number: 1 Legacy running event hash: c44263df8e2fdb3325fbd07fb14ab5d8087bfd7039563348b2b69f7ae9fa35e102480df0889a00a9cdcbf7429404340a Legacy running event mnemonic: jelly-know-demand-often Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1450302654 Root hash: a32f64e3e66478d88f70ee81b0c28a898a2070d8bd48f2d5e522a8c77b83237f23e5d671fc0ea4530b51e121ca268728 (root) ConsistencyTestingToolState / olive-organ-crime-brand 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 poverty-gadget-sheriff-claw 1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon 2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy 3 StringLeaf 1931016930446315563 /3 almost-atom-novel-view 4 StringLeaf 1 /4 wreck-whale-old-bottom
node2 20.580s 2025-12-04 02:37:59.261 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/12/04/2025-12-04T02+37+55.110908429Z_seq0_minr1_maxr501_orgn0.pces
node2 20.581s 2025-12-04 02:37:59.262 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/2/2025/12/04/2025-12-04T02+37+55.110908429Z_seq0_minr1_maxr501_orgn0.pces
node2 20.581s 2025-12-04 02:37:59.262 116 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 20.582s 2025-12-04 02:37:59.263 117 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 20.586s 2025-12-04 02:37:59.267 118 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 20.604s 2025-12-04 02:37:59.285 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node0 20.607s 2025-12-04 02:37:59.288 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1 Timestamp: 2025-12-04T02:37:56.525093565Z Next consensus number: 1 Legacy running event hash: c44263df8e2fdb3325fbd07fb14ab5d8087bfd7039563348b2b69f7ae9fa35e102480df0889a00a9cdcbf7429404340a Legacy running event mnemonic: jelly-know-demand-often Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1450302654 Root hash: a32f64e3e66478d88f70ee81b0c28a898a2070d8bd48f2d5e522a8c77b83237f23e5d671fc0ea4530b51e121ca268728 (root) ConsistencyTestingToolState / olive-organ-crime-brand 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 poverty-gadget-sheriff-claw 1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon 2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy 3 StringLeaf 1931016930446315563 /3 almost-atom-novel-view 4 StringLeaf 1 /4 wreck-whale-old-bottom
node3 20.636s 2025-12-04 02:37:59.317 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node3 20.639s 2025-12-04 02:37:59.320 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1 Timestamp: 2025-12-04T02:37:56.525093565Z Next consensus number: 1 Legacy running event hash: c44263df8e2fdb3325fbd07fb14ab5d8087bfd7039563348b2b69f7ae9fa35e102480df0889a00a9cdcbf7429404340a Legacy running event mnemonic: jelly-know-demand-often Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1450302654 Root hash: a32f64e3e66478d88f70ee81b0c28a898a2070d8bd48f2d5e522a8c77b83237f23e5d671fc0ea4530b51e121ca268728 (root) ConsistencyTestingToolState / olive-organ-crime-brand 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 poverty-gadget-sheriff-claw 1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon 2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy 3 StringLeaf 1931016930446315563 /3 almost-atom-novel-view 4 StringLeaf 1 /4 wreck-whale-old-bottom
node0 20.643s 2025-12-04 02:37:59.324 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/12/04/2025-12-04T02+37+54.982574427Z_seq0_minr1_maxr501_orgn0.pces
node0 20.644s 2025-12-04 02:37:59.325 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/0/2025/12/04/2025-12-04T02+37+54.982574427Z_seq0_minr1_maxr501_orgn0.pces
node0 20.644s 2025-12-04 02:37:59.325 116 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 20.645s 2025-12-04 02:37:59.326 117 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 20.649s 2025-12-04 02:37:59.330 118 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 20.687s 2025-12-04 02:37:59.368 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/12/04/2025-12-04T02+37+55.067700683Z_seq0_minr1_maxr501_orgn0.pces
node3 20.687s 2025-12-04 02:37:59.368 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/3/2025/12/04/2025-12-04T02+37+55.067700683Z_seq0_minr1_maxr501_orgn0.pces
node3 20.688s 2025-12-04 02:37:59.369 116 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 20.688s 2025-12-04 02:37:59.369 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node3 20.689s 2025-12-04 02:37:59.370 117 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 20.691s 2025-12-04 02:37:59.372 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1 Timestamp: 2025-12-04T02:37:56.525093565Z Next consensus number: 1 Legacy running event hash: c44263df8e2fdb3325fbd07fb14ab5d8087bfd7039563348b2b69f7ae9fa35e102480df0889a00a9cdcbf7429404340a Legacy running event mnemonic: jelly-know-demand-often Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1450302654 Root hash: a32f64e3e66478d88f70ee81b0c28a898a2070d8bd48f2d5e522a8c77b83237f23e5d671fc0ea4530b51e121ca268728 (root) ConsistencyTestingToolState / olive-organ-crime-brand 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 poverty-gadget-sheriff-claw 1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon 2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy 3 StringLeaf 1931016930446315563 /3 almost-atom-novel-view 4 StringLeaf 1 /4 wreck-whale-old-bottom
node3 20.694s 2025-12-04 02:37:59.375 118 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 20.721s 2025-12-04 02:37:59.402 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/12/04/2025-12-04T02+37+54.823990167Z_seq0_minr1_maxr501_orgn0.pces
node4 20.721s 2025-12-04 02:37:59.402 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/4/2025/12/04/2025-12-04T02+37+54.823990167Z_seq0_minr1_maxr501_orgn0.pces
node4 20.721s 2025-12-04 02:37:59.402 116 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 20.722s 2025-12-04 02:37:59.403 117 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 20.726s 2025-12-04 02:37:59.407 118 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 23.295s 2025-12-04 02:38:01.976 158 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 6 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 23.338s 2025-12-04 02:38:02.019 150 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 6 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 23.343s 2025-12-04 02:38:02.024 152 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 6 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 23.380s 2025-12-04 02:38:02.061 162 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 6 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 23.443s 2025-12-04 02:38:02.124 160 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 6 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 23.664s 2025-12-04 02:38:02.345 160 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 6 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/6
node3 23.665s 2025-12-04 02:38:02.346 161 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 6
node1 23.712s 2025-12-04 02:38:02.393 162 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 6 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/6
node1 23.713s 2025-12-04 02:38:02.394 163 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 6
node3 23.770s 2025-12-04 02:38:02.451 192 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 6
node3 23.773s 2025-12-04 02:38:02.454 193 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 6
Timestamp: 2025-12-04T02:38:00.598069Z
Next consensus number: 126
Legacy running event hash: 0bb67af7d608f9f817ce7d76a2f7edae161b15e58dec0f7fb3febb6267adeedfb2b4daf17f3d7d21e7eca47702e53091
Legacy running event mnemonic: item-galaxy-disease-scissors
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1933715405
Root hash: cc7790eba21aa436792b820aa86ef73cf95fe2a6a2a145e1461bc6414259d1f83769ca3fef5264a57c2a13b7706074df
(root) ConsistencyTestingToolState / illness-tired-rubber-junk
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 prosper-vivid-soldier-pumpkin
    1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon
    2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy
    3 StringLeaf -6518265023548926736 /3 cement-keep-logic-road
    4 StringLeaf 6 /4 practice-learn-art-tourist
node4 23.776s 2025-12-04 02:38:02.457 164 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 6 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/6
node4 23.777s 2025-12-04 02:38:02.458 165 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 6
node3 23.784s 2025-12-04 02:38:02.465 194 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/12/04/2025-12-04T02+37+55.067700683Z_seq0_minr1_maxr501_orgn0.pces
node3 23.785s 2025-12-04 02:38:02.466 195 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/3/2025/12/04/2025-12-04T02+37+55.067700683Z_seq0_minr1_maxr501_orgn0.pces
node3 23.785s 2025-12-04 02:38:02.466 196 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 23.786s 2025-12-04 02:38:02.467 197 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 23.787s 2025-12-04 02:38:02.468 198 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 6 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/6 {"round":6,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/6/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 23.797s 2025-12-04 02:38:02.478 202 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 6
node1 23.800s 2025-12-04 02:38:02.481 203 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 6
Timestamp: 2025-12-04T02:38:00.598069Z
Next consensus number: 126
Legacy running event hash: 0bb67af7d608f9f817ce7d76a2f7edae161b15e58dec0f7fb3febb6267adeedfb2b4daf17f3d7d21e7eca47702e53091
Legacy running event mnemonic: item-galaxy-disease-scissors
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1933715405
Root hash: cc7790eba21aa436792b820aa86ef73cf95fe2a6a2a145e1461bc6414259d1f83769ca3fef5264a57c2a13b7706074df
(root) ConsistencyTestingToolState / illness-tired-rubber-junk
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 prosper-vivid-soldier-pumpkin
    1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon
    2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy
    3 StringLeaf -6518265023548926736 /3 cement-keep-logic-road
    4 StringLeaf 6 /4 practice-learn-art-tourist
node1 23.809s 2025-12-04 02:38:02.490 204 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/12/04/2025-12-04T02+37+54.916827754Z_seq0_minr1_maxr501_orgn0.pces
node1 23.809s 2025-12-04 02:38:02.490 205 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/1/2025/12/04/2025-12-04T02+37+54.916827754Z_seq0_minr1_maxr501_orgn0.pces
node1 23.809s 2025-12-04 02:38:02.490 206 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 23.810s 2025-12-04 02:38:02.491 207 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 23.810s 2025-12-04 02:38:02.491 208 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 6 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/6 {"round":6,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/6/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 23.862s 2025-12-04 02:38:02.543 204 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 6
node4 23.864s 2025-12-04 02:38:02.545 205 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 6
Timestamp: 2025-12-04T02:38:00.598069Z
Next consensus number: 126
Legacy running event hash: 0bb67af7d608f9f817ce7d76a2f7edae161b15e58dec0f7fb3febb6267adeedfb2b4daf17f3d7d21e7eca47702e53091
Legacy running event mnemonic: item-galaxy-disease-scissors
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1933715405
Root hash: cc7790eba21aa436792b820aa86ef73cf95fe2a6a2a145e1461bc6414259d1f83769ca3fef5264a57c2a13b7706074df
(root) ConsistencyTestingToolState / illness-tired-rubber-junk
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 prosper-vivid-soldier-pumpkin
    1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon
    2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy
    3 StringLeaf -6518265023548926736 /3 cement-keep-logic-road
    4 StringLeaf 6 /4 practice-learn-art-tourist
node4 23.871s 2025-12-04 02:38:02.552 206 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/12/04/2025-12-04T02+37+54.823990167Z_seq0_minr1_maxr501_orgn0.pces
node4 23.872s 2025-12-04 02:38:02.553 207 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/4/2025/12/04/2025-12-04T02+37+54.823990167Z_seq0_minr1_maxr501_orgn0.pces
node4 23.872s 2025-12-04 02:38:02.553 208 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 23.873s 2025-12-04 02:38:02.554 209 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 23.873s 2025-12-04 02:38:02.554 210 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 6 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/6 {"round":6,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/6/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 23.926s 2025-12-04 02:38:02.607 164 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 6 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/6
node0 23.927s 2025-12-04 02:38:02.608 165 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 6
node2 23.996s 2025-12-04 02:38:02.677 162 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 6 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/6
node2 23.997s 2025-12-04 02:38:02.678 163 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 6
node0 24.032s 2025-12-04 02:38:02.713 198 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 6
node0 24.035s 2025-12-04 02:38:02.716 199 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 6
Timestamp: 2025-12-04T02:38:00.598069Z
Next consensus number: 126
Legacy running event hash: 0bb67af7d608f9f817ce7d76a2f7edae161b15e58dec0f7fb3febb6267adeedfb2b4daf17f3d7d21e7eca47702e53091
Legacy running event mnemonic: item-galaxy-disease-scissors
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1933715405
Root hash: cc7790eba21aa436792b820aa86ef73cf95fe2a6a2a145e1461bc6414259d1f83769ca3fef5264a57c2a13b7706074df
(root) ConsistencyTestingToolState / illness-tired-rubber-junk
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 prosper-vivid-soldier-pumpkin
    1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon
    2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy
    3 StringLeaf -6518265023548926736 /3 cement-keep-logic-road
    4 StringLeaf 6 /4 practice-learn-art-tourist
node0 24.043s 2025-12-04 02:38:02.724 200 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/12/04/2025-12-04T02+37+54.982574427Z_seq0_minr1_maxr501_orgn0.pces
node0 24.043s 2025-12-04 02:38:02.724 201 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/0/2025/12/04/2025-12-04T02+37+54.982574427Z_seq0_minr1_maxr501_orgn0.pces
node0 24.043s 2025-12-04 02:38:02.724 202 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 24.044s 2025-12-04 02:38:02.725 203 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 24.044s 2025-12-04 02:38:02.725 204 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 6 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/6 {"round":6,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/6/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 24.081s 2025-12-04 02:38:02.762 194 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 6
node2 24.083s 2025-12-04 02:38:02.764 195 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 6
Timestamp: 2025-12-04T02:38:00.598069Z
Next consensus number: 126
Legacy running event hash: 0bb67af7d608f9f817ce7d76a2f7edae161b15e58dec0f7fb3febb6267adeedfb2b4daf17f3d7d21e7eca47702e53091
Legacy running event mnemonic: item-galaxy-disease-scissors
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1933715405
Root hash: cc7790eba21aa436792b820aa86ef73cf95fe2a6a2a145e1461bc6414259d1f83769ca3fef5264a57c2a13b7706074df
(root) ConsistencyTestingToolState / illness-tired-rubber-junk
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 prosper-vivid-soldier-pumpkin
    1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon
    2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy
    3 StringLeaf -6518265023548926736 /3 cement-keep-logic-road
    4 StringLeaf 6 /4 practice-learn-art-tourist
node2 24.092s 2025-12-04 02:38:02.773 196 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/12/04/2025-12-04T02+37+55.110908429Z_seq0_minr1_maxr501_orgn0.pces
node2 24.092s 2025-12-04 02:38:02.773 197 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/2/2025/12/04/2025-12-04T02+37+55.110908429Z_seq0_minr1_maxr501_orgn0.pces
node2 24.093s 2025-12-04 02:38:02.774 198 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 24.093s 2025-12-04 02:38:02.774 199 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 24.094s 2025-12-04 02:38:02.775 200 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 6 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/6 {"round":6,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/6/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 1m 23.763s 2025-12-04 02:39:02.444 1216 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 99 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 1m 23.835s 2025-12-04 02:39:02.516 1226 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 99 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 1m 23.911s 2025-12-04 02:39:02.592 1220 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 99 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 1m 23.949s 2025-12-04 02:39:02.630 1220 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 99 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 1m 23.958s 2025-12-04 02:39:02.639 1218 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 99 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 1m 24.221s 2025-12-04 02:39:02.902 1229 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 99 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/99
node0 1m 24.222s 2025-12-04 02:39:02.903 1230 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 99
node4 1m 24.271s 2025-12-04 02:39:02.952 1229 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 99 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/99
node4 1m 24.272s 2025-12-04 02:39:02.953 1230 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 99
node1 1m 24.291s 2025-12-04 02:39:02.972 1235 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 99 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/99
node1 1m 24.292s 2025-12-04 02:39:02.973 1236 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 99
node3 1m 24.309s 2025-12-04 02:39:02.990 1225 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 99 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/99
node3 1m 24.310s 2025-12-04 02:39:02.991 1226 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 99
node0 1m 24.329s 2025-12-04 02:39:03.010 1281 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 99
node0 1m 24.332s 2025-12-04 02:39:03.013 1282 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 99
Timestamp: 2025-12-04T02:39:00.318286Z
Next consensus number: 2622
Legacy running event hash: 048dab308a763212c67e8cb42cede7defd45ade345e32df9a7b1dde48d16f1010a8e8e1ccad444a99506e2cb2bc62657
Legacy running event mnemonic: museum-old-rally-drastic
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 2031315257
Root hash: a6db5ea477d73ce5f97b6b6960fbc41a899b8368ec300d54c5670489857ab2a429a861769946e8fabbb9d507024edcfc
(root) ConsistencyTestingToolState / caution-stand-wife-original
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 birth-seminar-happy-disagree
    1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon
    2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy
    3 StringLeaf 5752389657988587717 /3 exclude-multiply-foam-bronze
    4 StringLeaf 99 /4 crash-series-tilt-occur
node2 1m 24.342s 2025-12-04 02:39:03.023 1227 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 99 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/99
node2 1m 24.342s 2025-12-04 02:39:03.023 1228 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 99
node0 1m 24.345s 2025-12-04 02:39:03.026 1283 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/12/04/2025-12-04T02+37+54.982574427Z_seq0_minr1_maxr501_orgn0.pces
node0 1m 24.346s 2025-12-04 02:39:03.027 1284 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 72 File: data/saved/preconsensus-events/0/2025/12/04/2025-12-04T02+37+54.982574427Z_seq0_minr1_maxr501_orgn0.pces
node0 1m 24.346s 2025-12-04 02:39:03.027 1285 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 1m 24.348s 2025-12-04 02:39:03.029 1286 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 1m 24.349s 2025-12-04 02:39:03.030 1287 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 99 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/99 {"round":99,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/99/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 1m 24.353s 2025-12-04 02:39:03.034 1265 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 99
node4 1m 24.355s 2025-12-04 02:39:03.036 1266 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 99
Timestamp: 2025-12-04T02:39:00.318286Z
Next consensus number: 2622
Legacy running event hash: 048dab308a763212c67e8cb42cede7defd45ade345e32df9a7b1dde48d16f1010a8e8e1ccad444a99506e2cb2bc62657
Legacy running event mnemonic: museum-old-rally-drastic
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 2031315257
Root hash: a6db5ea477d73ce5f97b6b6960fbc41a899b8368ec300d54c5670489857ab2a429a861769946e8fabbb9d507024edcfc
(root) ConsistencyTestingToolState / caution-stand-wife-original
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 birth-seminar-happy-disagree
    1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon
    2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy
    3 StringLeaf 5752389657988587717 /3 exclude-multiply-foam-bronze
    4 StringLeaf 99 /4 crash-series-tilt-occur
node4 1m 24.365s 2025-12-04 02:39:03.046 1267 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/12/04/2025-12-04T02+37+54.823990167Z_seq0_minr1_maxr501_orgn0.pces
node4 1m 24.366s 2025-12-04 02:39:03.047 1268 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 72 File: data/saved/preconsensus-events/4/2025/12/04/2025-12-04T02+37+54.823990167Z_seq0_minr1_maxr501_orgn0.pces
node4 1m 24.366s 2025-12-04 02:39:03.047 1269 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 1m 24.368s 2025-12-04 02:39:03.049 1270 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 1m 24.369s 2025-12-04 02:39:03.050 1271 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 99 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/99 {"round":99,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/99/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 1m 24.376s 2025-12-04 02:39:03.057 1287 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 99
node1 1m 24.379s 2025-12-04 02:39:03.060 1288 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 99
Timestamp: 2025-12-04T02:39:00.318286Z
Next consensus number: 2622
Legacy running event hash: 048dab308a763212c67e8cb42cede7defd45ade345e32df9a7b1dde48d16f1010a8e8e1ccad444a99506e2cb2bc62657
Legacy running event mnemonic: museum-old-rally-drastic
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 2031315257
Root hash: a6db5ea477d73ce5f97b6b6960fbc41a899b8368ec300d54c5670489857ab2a429a861769946e8fabbb9d507024edcfc
(root) ConsistencyTestingToolState / caution-stand-wife-original
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 birth-seminar-happy-disagree
    1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon
    2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy
    3 StringLeaf 5752389657988587717 /3 exclude-multiply-foam-bronze
    4 StringLeaf 99 /4 crash-series-tilt-occur
node1 1m 24.388s 2025-12-04 02:39:03.069 1289 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/12/04/2025-12-04T02+37+54.916827754Z_seq0_minr1_maxr501_orgn0.pces
node1 1m 24.388s 2025-12-04 02:39:03.069 1290 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 72 File: data/saved/preconsensus-events/1/2025/12/04/2025-12-04T02+37+54.916827754Z_seq0_minr1_maxr501_orgn0.pces
node1 1m 24.388s 2025-12-04 02:39:03.069 1291 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 1m 24.390s 2025-12-04 02:39:03.071 1292 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 1m 24.391s 2025-12-04 02:39:03.072 1293 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 99 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/99 {"round":99,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/99/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 1m 24.397s 2025-12-04 02:39:03.078 1261 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 99
node3 1m 24.399s 2025-12-04 02:39:03.080 1262 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 99
Timestamp: 2025-12-04T02:39:00.318286Z
Next consensus number: 2622
Legacy running event hash: 048dab308a763212c67e8cb42cede7defd45ade345e32df9a7b1dde48d16f1010a8e8e1ccad444a99506e2cb2bc62657
Legacy running event mnemonic: museum-old-rally-drastic
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 2031315257
Root hash: a6db5ea477d73ce5f97b6b6960fbc41a899b8368ec300d54c5670489857ab2a429a861769946e8fabbb9d507024edcfc
(root) ConsistencyTestingToolState / caution-stand-wife-original
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 birth-seminar-happy-disagree
    1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon
    2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy
    3 StringLeaf 5752389657988587717 /3 exclude-multiply-foam-bronze
    4 StringLeaf 99 /4 crash-series-tilt-occur
node3 1m 24.408s 2025-12-04 02:39:03.089 1263 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/12/04/2025-12-04T02+37+55.067700683Z_seq0_minr1_maxr501_orgn0.pces
node3 1m 24.408s 2025-12-04 02:39:03.089 1264 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 72 File: data/saved/preconsensus-events/3/2025/12/04/2025-12-04T02+37+55.067700683Z_seq0_minr1_maxr501_orgn0.pces
node3 1m 24.408s 2025-12-04 02:39:03.089 1265 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 1m 24.411s 2025-12-04 02:39:03.092 1266 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 1m 24.411s 2025-12-04 02:39:03.092 1267 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 99 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/99 {"round":99,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/99/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 1m 24.425s 2025-12-04 02:39:03.106 1263 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 99
node2 1m 24.427s 2025-12-04 02:39:03.108 1264 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 99
Timestamp: 2025-12-04T02:39:00.318286Z
Next consensus number: 2622
Legacy running event hash: 048dab308a763212c67e8cb42cede7defd45ade345e32df9a7b1dde48d16f1010a8e8e1ccad444a99506e2cb2bc62657
Legacy running event mnemonic: museum-old-rally-drastic
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 2031315257
Root hash: a6db5ea477d73ce5f97b6b6960fbc41a899b8368ec300d54c5670489857ab2a429a861769946e8fabbb9d507024edcfc
(root) ConsistencyTestingToolState / caution-stand-wife-original
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 birth-seminar-happy-disagree
    1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon
    2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy
    3 StringLeaf 5752389657988587717 /3 exclude-multiply-foam-bronze
    4 StringLeaf 99 /4 crash-series-tilt-occur
node2 1m 24.435s 2025-12-04 02:39:03.116 1265 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/12/04/2025-12-04T02+37+55.110908429Z_seq0_minr1_maxr501_orgn0.pces
node2 1m 24.435s 2025-12-04 02:39:03.116 1266 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 72 File: data/saved/preconsensus-events/2/2025/12/04/2025-12-04T02+37+55.110908429Z_seq0_minr1_maxr501_orgn0.pces
node2 1m 24.436s 2025-12-04 02:39:03.117 1267 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 1m 24.438s 2025-12-04 02:39:03.119 1268 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 1m 24.440s 2025-12-04 02:39:03.121 1269 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 99 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/99 {"round":99,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/99/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 2m 22.534s 2025-12-04 02:40:01.215 2273 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 190 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 2m 22.690s 2025-12-04 02:40:01.371 2297 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 190 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 2m 22.785s 2025-12-04 02:40:01.466 2283 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 190 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 2m 22.820s 2025-12-04 02:40:01.501 2289 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 190 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 2m 22.832s 2025-12-04 02:40:01.513 2301 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 190 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 2m 23.129s 2025-12-04 02:40:01.810 2304 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 190 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/190
node2 2m 23.131s 2025-12-04 02:40:01.812 2305 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 190
node4 2m 23.170s 2025-12-04 02:40:01.851 2310 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 190 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/190
node4 2m 23.172s 2025-12-04 02:40:01.853 2311 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 190
node0 2m 23.199s 2025-12-04 02:40:01.880 2286 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 190 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/190
node0 2m 23.200s 2025-12-04 02:40:01.881 2287 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 190
node2 2m 23.218s 2025-12-04 02:40:01.899 2340 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 190
node2 2m 23.220s 2025-12-04 02:40:01.901 2341 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 190
Timestamp: 2025-12-04T02:40:00.048504Z
Next consensus number: 5101
Legacy running event hash: 15d31a81dc6c4a0b4f0a176a3b87284c688232e2d65ea7eeb8217683ebc23a680278efe85e2859006936397a9a266cee
Legacy running event mnemonic: risk-expand-income-door
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1109048935
Root hash: 30537c7ac026a72bf50d2a97774d6bf76957c87c878afba0f4e4691fc56422b84212662a4bd4a0239191e3b180ff1442
(root) ConsistencyTestingToolState / apology-monster-win-boy
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 fetch-jealous-near-eager
    1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon
    2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy
    3 StringLeaf -9177701731925398757 /3 inherit-cream-edit-eye
    4 StringLeaf 190 /4 derive-person-belt-ancient
node2 2m 23.230s 2025-12-04 02:40:01.911 2342 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/12/04/2025-12-04T02+37+55.110908429Z_seq0_minr1_maxr501_orgn0.pces
node2 2m 23.230s 2025-12-04 02:40:01.911 2343 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 163 File: data/saved/preconsensus-events/2/2025/12/04/2025-12-04T02+37+55.110908429Z_seq0_minr1_maxr501_orgn0.pces
node2 2m 23.230s 2025-12-04 02:40:01.911 2344 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 2m 23.234s 2025-12-04 02:40:01.915 2345 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 2m 23.235s 2025-12-04 02:40:01.916 2346 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 190 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/190 {"round":190,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/190/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 2m 23.260s 2025-12-04 02:40:01.941 2350 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 190
node4 2m 23.262s 2025-12-04 02:40:01.943 2351 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 190
Timestamp: 2025-12-04T02:40:00.048504Z
Next consensus number: 5101
Legacy running event hash: 15d31a81dc6c4a0b4f0a176a3b87284c688232e2d65ea7eeb8217683ebc23a680278efe85e2859006936397a9a266cee
Legacy running event mnemonic: risk-expand-income-door
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1109048935
Root hash: 30537c7ac026a72bf50d2a97774d6bf76957c87c878afba0f4e4691fc56422b84212662a4bd4a0239191e3b180ff1442
(root) ConsistencyTestingToolState / apology-monster-win-boy
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 fetch-jealous-near-eager
    1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon
    2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy
    3 StringLeaf -9177701731925398757 /3 inherit-cream-edit-eye
    4 StringLeaf 190 /4 derive-person-belt-ancient
node4 2m 23.270s 2025-12-04 02:40:01.951 2352 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/12/04/2025-12-04T02+37+54.823990167Z_seq0_minr1_maxr501_orgn0.pces
node4 2m 23.270s 2025-12-04 02:40:01.951 2353 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 163 File: data/saved/preconsensus-events/4/2025/12/04/2025-12-04T02+37+54.823990167Z_seq0_minr1_maxr501_orgn0.pces
node4 2m 23.270s 2025-12-04 02:40:01.951 2354 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 2m 23.274s 2025-12-04 02:40:01.955 2355 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 2m 23.274s 2025-12-04 02:40:01.955 2356 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 190 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/190 {"round":190,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/190/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 2m 23.288s 2025-12-04 02:40:01.969 2326 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 190
node0 2m 23.291s 2025-12-04 02:40:01.972 2327 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 190
Timestamp: 2025-12-04T02:40:00.048504Z
Next consensus number: 5101
Legacy running event hash: 15d31a81dc6c4a0b4f0a176a3b87284c688232e2d65ea7eeb8217683ebc23a680278efe85e2859006936397a9a266cee
Legacy running event mnemonic: risk-expand-income-door
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1109048935
Root hash: 30537c7ac026a72bf50d2a97774d6bf76957c87c878afba0f4e4691fc56422b84212662a4bd4a0239191e3b180ff1442
(root) ConsistencyTestingToolState / apology-monster-win-boy
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 fetch-jealous-near-eager
    1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon
    2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy
    3 StringLeaf -9177701731925398757 /3 inherit-cream-edit-eye
    4 StringLeaf 190 /4 derive-person-belt-ancient
node0 2m 23.298s 2025-12-04 02:40:01.979 2328 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/12/04/2025-12-04T02+37+54.982574427Z_seq0_minr1_maxr501_orgn0.pces
node0 2m 23.298s 2025-12-04 02:40:01.979 2329 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 163 File: data/saved/preconsensus-events/0/2025/12/04/2025-12-04T02+37+54.982574427Z_seq0_minr1_maxr501_orgn0.pces
node0 2m 23.298s 2025-12-04 02:40:01.979 2330 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 2m 23.302s 2025-12-04 02:40:01.983 2331 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 2m 23.303s 2025-12-04 02:40:01.984 2332 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 190 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/190 {"round":190,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/190/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 2m 23.325s 2025-12-04 02:40:02.006 2286 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 190 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/190
node1 2m 23.326s 2025-12-04 02:40:02.007 2287 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 190
node3 2m 23.347s 2025-12-04 02:40:02.028 2292 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 190 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/190
node3 2m 23.348s 2025-12-04 02:40:02.029 2293 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 190
node1 2m 23.410s 2025-12-04 02:40:02.091 2326 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 190
node1 2m 23.412s 2025-12-04 02:40:02.093 2327 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 190
Timestamp: 2025-12-04T02:40:00.048504Z
Next consensus number: 5101
Legacy running event hash: 15d31a81dc6c4a0b4f0a176a3b87284c688232e2d65ea7eeb8217683ebc23a680278efe85e2859006936397a9a266cee
Legacy running event mnemonic: risk-expand-income-door
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1109048935
Root hash: 30537c7ac026a72bf50d2a97774d6bf76957c87c878afba0f4e4691fc56422b84212662a4bd4a0239191e3b180ff1442
(root) ConsistencyTestingToolState / apology-monster-win-boy
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 fetch-jealous-near-eager
    1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon
    2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy
    3 StringLeaf -9177701731925398757 /3 inherit-cream-edit-eye
    4 StringLeaf 190 /4 derive-person-belt-ancient
node1 2m 23.421s 2025-12-04 02:40:02.102 2328 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/12/04/2025-12-04T02+37+54.916827754Z_seq0_minr1_maxr501_orgn0.pces
node1 2m 23.421s 2025-12-04 02:40:02.102 2329 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 163 File: data/saved/preconsensus-events/1/2025/12/04/2025-12-04T02+37+54.916827754Z_seq0_minr1_maxr501_orgn0.pces
node1 2m 23.421s 2025-12-04 02:40:02.102 2330 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 2m 23.425s 2025-12-04 02:40:02.106 2331 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 2m 23.426s 2025-12-04 02:40:02.107 2332 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 190 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/190 {"round":190,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/190/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 2m 23.433s 2025-12-04 02:40:02.114 2327 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 190
node3 2m 23.435s 2025-12-04 02:40:02.116 2328 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 190
Timestamp: 2025-12-04T02:40:00.048504Z
Next consensus number: 5101
Legacy running event hash: 15d31a81dc6c4a0b4f0a176a3b87284c688232e2d65ea7eeb8217683ebc23a680278efe85e2859006936397a9a266cee
Legacy running event mnemonic: risk-expand-income-door
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1109048935
Root hash: 30537c7ac026a72bf50d2a97774d6bf76957c87c878afba0f4e4691fc56422b84212662a4bd4a0239191e3b180ff1442
(root) ConsistencyTestingToolState / apology-monster-win-boy
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 fetch-jealous-near-eager
    1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon
    2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy
    3 StringLeaf -9177701731925398757 /3 inherit-cream-edit-eye
    4 StringLeaf 190 /4 derive-person-belt-ancient
node3 2m 23.442s 2025-12-04 02:40:02.123 2329 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/12/04/2025-12-04T02+37+55.067700683Z_seq0_minr1_maxr501_orgn0.pces
node3 2m 23.443s 2025-12-04 02:40:02.124 2330 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 163 File: data/saved/preconsensus-events/3/2025/12/04/2025-12-04T02+37+55.067700683Z_seq0_minr1_maxr501_orgn0.pces
node3 2m 23.443s 2025-12-04 02:40:02.124 2331 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 2m 23.447s 2025-12-04 02:40:02.128 2332 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 2m 23.447s 2025-12-04 02:40:02.128 2333 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 190 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/190 {"round":190,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/190/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 3m 14.609s 2025-12-04 02:40:53.290 3236 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 0 to 4>> NetworkUtils: Connection broken: 0 -> 4
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-12-04T02:40:53.287814250Z
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-12-04T02:40:53.287814250Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.sendAndReceiveEvents(ShadowgraphSynchronizer.java:241)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:201)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
    ... 7 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
    at java.base/java.io.DataInputStream.readByte(DataInputStream.java:275)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readByte(AugmentedDataInputStream.java:144)
    at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readEventsINeed$9(SyncUtils.java:278)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
    ... 12 more
node1 3m 14.609s 2025-12-04 02:40:53.290 3248 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 1 to 4>> NetworkUtils: Connection broken: 1 -> 4
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-12-04T02:40:53.287936879Z
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-12-04T02:40:53.287936879Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.sendAndReceiveEvents(ShadowgraphSynchronizer.java:241)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:201)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
    ... 7 more
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 12 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$sendEventsTheyNeed$8(SyncUtils.java:234)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
    at java.base/java.io.DataInputStream.readByte(DataInputStream.java:275)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readByte(AugmentedDataInputStream.java:144)
    at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readEventsINeed$9(SyncUtils.java:278)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
    ... 12 more
node2 3m 14.609s 2025-12-04 02:40:53.290 3246 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 2 to 4>> NetworkUtils: Connection broken: 2 -> 4
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-12-04T02:40:53.287991115Z
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-12-04T02:40:53.287991115Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:180)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
    ... 7 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
    at java.base/java.io.DataInputStream.readBoolean(DataInputStream.java:255)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readBoolean(AugmentedDataInputStream.java:137)
    at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readMyTipsTheyHave$7(SyncUtils.java:163)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
    ... 11 more
node3 3m 14.610s 2025-12-04 02:40:53.291 3254 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 3 to 4>> NetworkUtils: Connection broken: 3 -> 4
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-12-04T02:40:53.288478141Z
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
	at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
	at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
	at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
	at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
	at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-12-04T02:40:53.288478141Z
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:180)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
	... 7 more
Caused by: java.net.SocketException: Connection reset
	at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
	at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
	at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
	at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
	at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
	at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
	at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
	at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
	at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
	at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
	at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
	at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
	at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
	at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
	at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
	at java.base/java.io.DataInputStream.readBoolean(DataInputStream.java:255)
	at org.hiero.base.io.streams.AugmentedDataInputStream.readBoolean(AugmentedDataInputStream.java:137)
	at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readMyTipsTheyHave$7(SyncUtils.java:163)
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
	... 11 more
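[annotation] The cause chains above share one shape: a blocking socket read running on the sync protocol's parallel executor hits a "Connection reset", the executor wraps it in ParallelExecutionException, and the protocol loop surfaces it as IOException. A minimal sketch of that wrapping pattern follows; the class and method names below are hypothetical stand-ins, not the platform's own code.

```java
import java.io.IOException;
import java.net.SocketException;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class CauseChainSketch {
    // Hypothetical stand-in for the executor's ParallelExecutionException
    static class ParallelExecutionException extends Exception {
        ParallelExecutionException(Throwable cause) { super(cause); }
    }

    // Run a read task on a pool; wrap any failure, as a cached-pool executor would
    static void doParallel(ExecutorService pool, Callable<Void> task)
            throws ParallelExecutionException {
        try {
            pool.submit(task).get();
        } catch (ExecutionException e) {
            throw new ParallelExecutionException(e.getCause());
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            throw new ParallelExecutionException(e);
        }
    }

    // Protocol layer: rethrow as IOException, preserving the root cause for the log
    public static String rootCauseOf(Callable<Void> readTask) {
        ExecutorService pool = Executors.newCachedThreadPool();
        try {
            doParallel(pool, readTask);
            return "ok";
        } catch (ParallelExecutionException e) {
            IOException io = new IOException(e);
            Throwable root = io.getCause().getCause();
            return root.getClass().getSimpleName() + ": " + root.getMessage();
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) {
        // A read that fails the way the log shows: the peer closed the connection
        System.out.println(rootCauseOf(() -> { throw new SocketException("Connection reset"); }));
        // prints: SocketException: Connection reset
    }
}
```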
node2 3m 23.150s 2025-12-04 02:41:01.831 3394 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 284 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 3m 23.400s 2025-12-04 02:41:02.081 3400 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 284 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 3m 23.417s 2025-12-04 02:41:02.098 3382 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 284 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 3m 23.474s 2025-12-04 02:41:02.155 3392 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 284 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 3m 23.728s 2025-12-04 02:41:02.409 3385 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 284 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/284
node0 3m 23.728s 2025-12-04 02:41:02.409 3386 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for round 284
node3 3m 23.810s 2025-12-04 02:41:02.491 3403 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 284 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/284
node3 3m 23.811s 2025-12-04 02:41:02.492 3404 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for round 284
node0 3m 23.819s 2025-12-04 02:41:02.500 3417 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for round 284
node0 3m 23.821s 2025-12-04 02:41:02.502 3418 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 284
Timestamp: 2025-12-04T02:41:00.578487825Z
Next consensus number: 7522
Legacy running event hash: 89f1151ec5c503fe2cd9cae4a428abcf44adadb5f9950219a763291ff63e726b9d6336b23a894a7925c032da9b7f6534
Legacy running event mnemonic: sea-phrase-electric-lemon
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1692593896
Root hash: 16ee50719b3bf152c5e896302411493ec78de806efc5fb2c093a5bc99df8bb68e248b531721ea62aca3be9aa3488d4fd
(root) ConsistencyTestingToolState / tape-athlete-obvious-still
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 cage-bird-opera-call
    1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon
    2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy
    3 StringLeaf 1755993057610220009 /3 switch-thought-iron-goat
    4 StringLeaf 284 /4 submit-stairs-weather-among
node0 3m 23.830s 2025-12-04 02:41:02.511 3419 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/12/04/2025-12-04T02+37+54.982574427Z_seq0_minr1_maxr501_orgn0.pces
node0 3m 23.830s 2025-12-04 02:41:02.511 3420 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 256 File: data/saved/preconsensus-events/0/2025/12/04/2025-12-04T02+37+54.982574427Z_seq0_minr1_maxr501_orgn0.pces
node0 3m 23.830s 2025-12-04 02:41:02.511 3421 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 3m 23.836s 2025-12-04 02:41:02.517 3422 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 3m 23.837s 2025-12-04 02:41:02.518 3423 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 284 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/284 {"round":284,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/284/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 3m 23.881s 2025-12-04 02:41:02.562 3395 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 284 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/284
node1 3m 23.882s 2025-12-04 02:41:02.563 3396 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for round 284
node3 3m 23.910s 2025-12-04 02:41:02.591 3443 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for round 284
node3 3m 23.912s 2025-12-04 02:41:02.593 3444 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 284
Timestamp: 2025-12-04T02:41:00.578487825Z
Next consensus number: 7522
Legacy running event hash: 89f1151ec5c503fe2cd9cae4a428abcf44adadb5f9950219a763291ff63e726b9d6336b23a894a7925c032da9b7f6534
Legacy running event mnemonic: sea-phrase-electric-lemon
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1692593896
Root hash: 16ee50719b3bf152c5e896302411493ec78de806efc5fb2c093a5bc99df8bb68e248b531721ea62aca3be9aa3488d4fd
(root) ConsistencyTestingToolState / tape-athlete-obvious-still
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 cage-bird-opera-call
    1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon
    2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy
    3 StringLeaf 1755993057610220009 /3 switch-thought-iron-goat
    4 StringLeaf 284 /4 submit-stairs-weather-among
node3 3m 23.918s 2025-12-04 02:41:02.599 3445 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/12/04/2025-12-04T02+37+55.067700683Z_seq0_minr1_maxr501_orgn0.pces
node3 3m 23.918s 2025-12-04 02:41:02.599 3446 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 256 File: data/saved/preconsensus-events/3/2025/12/04/2025-12-04T02+37+55.067700683Z_seq0_minr1_maxr501_orgn0.pces
node3 3m 23.918s 2025-12-04 02:41:02.599 3447 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 3m 23.924s 2025-12-04 02:41:02.605 3448 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 3m 23.924s 2025-12-04 02:41:02.605 3449 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 284 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/284 {"round":284,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/284/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 3m 23.985s 2025-12-04 02:41:02.666 3435 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for round 284
node1 3m 23.987s 2025-12-04 02:41:02.668 3436 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 284
Timestamp: 2025-12-04T02:41:00.578487825Z
Next consensus number: 7522
Legacy running event hash: 89f1151ec5c503fe2cd9cae4a428abcf44adadb5f9950219a763291ff63e726b9d6336b23a894a7925c032da9b7f6534
Legacy running event mnemonic: sea-phrase-electric-lemon
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1692593896
Root hash: 16ee50719b3bf152c5e896302411493ec78de806efc5fb2c093a5bc99df8bb68e248b531721ea62aca3be9aa3488d4fd
(root) ConsistencyTestingToolState / tape-athlete-obvious-still
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 cage-bird-opera-call
    1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon
    2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy
    3 StringLeaf 1755993057610220009 /3 switch-thought-iron-goat
    4 StringLeaf 284 /4 submit-stairs-weather-among
node1 3m 23.996s 2025-12-04 02:41:02.677 3437 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/12/04/2025-12-04T02+37+54.916827754Z_seq0_minr1_maxr501_orgn0.pces
node1 3m 23.997s 2025-12-04 02:41:02.678 3438 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 256 File: data/saved/preconsensus-events/1/2025/12/04/2025-12-04T02+37+54.916827754Z_seq0_minr1_maxr501_orgn0.pces
node1 3m 23.997s 2025-12-04 02:41:02.678 3439 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 3m 24.004s 2025-12-04 02:41:02.685 3440 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 3m 24.004s 2025-12-04 02:41:02.685 3441 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 284 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/284 {"round":284,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/284/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 3m 24.050s 2025-12-04 02:41:02.731 3407 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 284 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/284
node2 3m 24.051s 2025-12-04 02:41:02.732 3408 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for round 284
node2 3m 24.142s 2025-12-04 02:41:02.823 3452 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for round 284
node2 3m 24.144s 2025-12-04 02:41:02.825 3453 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 284
Timestamp: 2025-12-04T02:41:00.578487825Z
Next consensus number: 7522
Legacy running event hash: 89f1151ec5c503fe2cd9cae4a428abcf44adadb5f9950219a763291ff63e726b9d6336b23a894a7925c032da9b7f6534
Legacy running event mnemonic: sea-phrase-electric-lemon
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1692593896
Root hash: 16ee50719b3bf152c5e896302411493ec78de806efc5fb2c093a5bc99df8bb68e248b531721ea62aca3be9aa3488d4fd
(root) ConsistencyTestingToolState / tape-athlete-obvious-still
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 cage-bird-opera-call
    1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon
    2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy
    3 StringLeaf 1755993057610220009 /3 switch-thought-iron-goat
    4 StringLeaf 284 /4 submit-stairs-weather-among
node2 3m 24.151s 2025-12-04 02:41:02.832 3454 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/12/04/2025-12-04T02+37+55.110908429Z_seq0_minr1_maxr501_orgn0.pces
node2 3m 24.152s 2025-12-04 02:41:02.833 3455 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 256 File: data/saved/preconsensus-events/2/2025/12/04/2025-12-04T02+37+55.110908429Z_seq0_minr1_maxr501_orgn0.pces
node2 3m 24.152s 2025-12-04 02:41:02.833 3456 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 3m 24.157s 2025-12-04 02:41:02.838 3457 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 3m 24.158s 2025-12-04 02:41:02.839 3458 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 284 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/284 {"round":284,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/284/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
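[annotation] Every node's round-284 snapshot applies the same BestEffortPcesFileCopy filter: the single on-disk PCES file qualifies because the maximum round encoded in its name (maxr501) is at or above the copy's lower bound (256 here, 347 for round 374 below). A sketch of that filename-based filter follows, under the assumption that the criterion is maxr ≥ lower bound; the exact platform criterion is not shown in the log, and the class below is hypothetical.

```java
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import java.util.stream.Collectors;

public class PcesCopySketch {
    // PCES file names embed their round span, e.g. ..._seq0_minr1_maxr501_orgn0.pces
    private static final Pattern MAX_ROUND = Pattern.compile("maxr(\\d+)");

    // A file qualifies for the copy if its maximum round reaches the state's
    // lower bound, i.e. it may still contain events the saved state needs.
    public static List<String> filesToCopy(List<String> fileNames, long lowerBound) {
        return fileNames.stream()
                .filter(name -> {
                    Matcher m = MAX_ROUND.matcher(name);
                    return m.find() && Long.parseLong(m.group(1)) >= lowerBound;
                })
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        String f = "2025-12-04T02+37+55.110908429Z_seq0_minr1_maxr501_orgn0.pces";
        // With lower bound 256, the one on-disk file qualifies, matching the log
        System.out.println(filesToCopy(List.of(f), 256).size()); // prints: 1
    }
}
```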
node3 4m 23.450s 2025-12-04 02:42:02.131 4454 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 374 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 4m 23.494s 2025-12-04 02:42:02.175 4460 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 374 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 4m 23.587s 2025-12-04 02:42:02.268 4482 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 374 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 4m 23.703s 2025-12-04 02:42:02.384 4498 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 374 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 4m 23.976s 2025-12-04 02:42:02.657 4501 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 374 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/374
node2 4m 23.977s 2025-12-04 02:42:02.658 4502 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for round 374
node0 4m 24.010s 2025-12-04 02:42:02.691 4485 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 374 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/374
node0 4m 24.011s 2025-12-04 02:42:02.692 4486 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for round 374
node1 4m 24.046s 2025-12-04 02:42:02.727 4473 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 374 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/374
node1 4m 24.047s 2025-12-04 02:42:02.728 4474 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for round 374
node2 4m 24.066s 2025-12-04 02:42:02.747 4533 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for round 374
node2 4m 24.068s 2025-12-04 02:42:02.749 4534 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 374
Timestamp: 2025-12-04T02:42:00.714617Z
Next consensus number: 9062
Legacy running event hash: 23b03718b8f3baf84853d268f800d57e37605cecdedd1f0db5d59fca5a7360eeb3c9c1f9fed17de62ffa32acac1a59b6
Legacy running event mnemonic: invite-toast-ticket-patient
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -2033230271
Root hash: 2b98819caf568a25d6347ffbab5a1638d0f077f410e609bb9144dc7615549f84b00aa19f9203072ce8a101b9b56078a4
(root) ConsistencyTestingToolState / indoor-smart-best-rocket
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 wet-motor-bread-visit
    1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon
    2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy
    3 StringLeaf -5398528371430497024 /3 duck-social-knife-glad
    4 StringLeaf 374 /4 seek-act-fatal-suspect
node2 4m 24.074s 2025-12-04 02:42:02.755 4535 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/12/04/2025-12-04T02+37+55.110908429Z_seq0_minr1_maxr501_orgn0.pces
node2 4m 24.074s 2025-12-04 02:42:02.755 4536 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 347 File: data/saved/preconsensus-events/2/2025/12/04/2025-12-04T02+37+55.110908429Z_seq0_minr1_maxr501_orgn0.pces
node2 4m 24.075s 2025-12-04 02:42:02.756 4537 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 4m 24.081s 2025-12-04 02:42:02.762 4538 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 4m 24.082s 2025-12-04 02:42:02.763 4539 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 374 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/374 {"round":374,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/374/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 4m 24.083s 2025-12-04 02:42:02.764 4540 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1
node0 4m 24.116s 2025-12-04 02:42:02.797 4525 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for round 374
node0 4m 24.119s 2025-12-04 02:42:02.800 4526 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 374
Timestamp: 2025-12-04T02:42:00.714617Z
Next consensus number: 9062
Legacy running event hash: 23b03718b8f3baf84853d268f800d57e37605cecdedd1f0db5d59fca5a7360eeb3c9c1f9fed17de62ffa32acac1a59b6
Legacy running event mnemonic: invite-toast-ticket-patient
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -2033230271
Root hash: 2b98819caf568a25d6347ffbab5a1638d0f077f410e609bb9144dc7615549f84b00aa19f9203072ce8a101b9b56078a4
(root) ConsistencyTestingToolState / indoor-smart-best-rocket
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 wet-motor-bread-visit
    1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon
    2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy
    3 StringLeaf -5398528371430497024 /3 duck-social-knife-glad
    4 StringLeaf 374 /4 seek-act-fatal-suspect
node0 4m 24.128s 2025-12-04 02:42:02.809 4527 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/12/04/2025-12-04T02+37+54.982574427Z_seq0_minr1_maxr501_orgn0.pces
node0 4m 24.129s 2025-12-04 02:42:02.810 4528 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 347 File: data/saved/preconsensus-events/0/2025/12/04/2025-12-04T02+37+54.982574427Z_seq0_minr1_maxr501_orgn0.pces
node0 4m 24.129s 2025-12-04 02:42:02.810 4529 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 4m 24.136s 2025-12-04 02:42:02.817 4530 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 4m 24.137s 2025-12-04 02:42:02.818 4531 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 374 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/374 {"round":374,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/374/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 4m 24.138s 2025-12-04 02:42:02.819 4508 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for round 374
node0 4m 24.139s 2025-12-04 02:42:02.820 4532 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1
node1 4m 24.140s 2025-12-04 02:42:02.821 4509 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 374
Timestamp: 2025-12-04T02:42:00.714617Z
Next consensus number: 9062
Legacy running event hash: 23b03718b8f3baf84853d268f800d57e37605cecdedd1f0db5d59fca5a7360eeb3c9c1f9fed17de62ffa32acac1a59b6
Legacy running event mnemonic: invite-toast-ticket-patient
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -2033230271
Root hash: 2b98819caf568a25d6347ffbab5a1638d0f077f410e609bb9144dc7615549f84b00aa19f9203072ce8a101b9b56078a4
(root) ConsistencyTestingToolState / indoor-smart-best-rocket
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 wet-motor-bread-visit
    1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon
    2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy
    3 StringLeaf -5398528371430497024 /3 duck-social-knife-glad
    4 StringLeaf 374 /4 seek-act-fatal-suspect
node3 4m 24.146s 2025-12-04 02:42:02.827 4457 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 374 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/374
node3 4m 24.146s 2025-12-04 02:42:02.827 4459 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for round 374
node1 4m 24.147s 2025-12-04 02:42:02.828 4510 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/12/04/2025-12-04T02+37+54.916827754Z_seq0_minr1_maxr501_orgn0.pces
node1 4m 24.147s 2025-12-04 02:42:02.828 4511 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 347 File: data/saved/preconsensus-events/1/2025/12/04/2025-12-04T02+37+54.916827754Z_seq0_minr1_maxr501_orgn0.pces
node1 4m 24.147s 2025-12-04 02:42:02.828 4512 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 4m 24.154s 2025-12-04 02:42:02.835 4513 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 4m 24.155s 2025-12-04 02:42:02.836 4514 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 374 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/374 {"round":374,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/374/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 4m 24.156s 2025-12-04 02:42:02.837 4515 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1
node3 4m 24.246s 2025-12-04 02:42:02.927 4494 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for round 374
node3 4m 24.248s 2025-12-04 02:42:02.929 4495 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 374
Timestamp: 2025-12-04T02:42:00.714617Z
Next consensus number: 9062
Legacy running event hash: 23b03718b8f3baf84853d268f800d57e37605cecdedd1f0db5d59fca5a7360eeb3c9c1f9fed17de62ffa32acac1a59b6
Legacy running event mnemonic: invite-toast-ticket-patient
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -2033230271
Root hash: 2b98819caf568a25d6347ffbab5a1638d0f077f410e609bb9144dc7615549f84b00aa19f9203072ce8a101b9b56078a4
(root) ConsistencyTestingToolState / indoor-smart-best-rocket
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 wet-motor-bread-visit
    1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon
    2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy
    3 StringLeaf -5398528371430497024 /3 duck-social-knife-glad
    4 StringLeaf 374 /4 seek-act-fatal-suspect
node3 4m 24.256s 2025-12-04 02:42:02.937 4506 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/12/04/2025-12-04T02+37+55.067700683Z_seq0_minr1_maxr501_orgn0.pces
node3 4m 24.256s 2025-12-04 02:42:02.937 4507 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 347
File: data/saved/preconsensus-events/3/2025/12/04/2025-12-04T02+37+55.067700683Z_seq0_minr1_maxr501_orgn0.pces
node3 4m 24.257s 2025-12-04 02:42:02.938 4508 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 4m 24.263s 2025-12-04 02:42:02.944 4509 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 4m 24.264s 2025-12-04 02:42:02.945 4510 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 374 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/374 {"round":374,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/374/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 4m 24.266s 2025-12-04 02:42:02.947 4511 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1
node0 5m 22.625s 2025-12-04 02:43:01.306 5672 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 473 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 5m 22.753s 2025-12-04 02:43:01.434 5678 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 473 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 5m 22.840s 2025-12-04 02:43:01.521 5648 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 473 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 5m 22.893s 2025-12-04 02:43:01.574 5700 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 473 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 5m 23.035s 2025-12-04 02:43:01.716 5681 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 473 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/473
node1 5m 23.035s 2025-12-04 02:43:01.716 5682 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 473
node0 5m 23.105s 2025-12-04 02:43:01.786 5675 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 473 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/473
node2 5m 23.105s 2025-12-04 02:43:01.786 5703 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 473 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/473
node0 5m 23.106s 2025-12-04 02:43:01.787 5676 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 473
node2 5m 23.106s 2025-12-04 02:43:01.787 5704 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 473
node1 5m 23.124s 2025-12-04 02:43:01.805 5713 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 473
node1 5m 23.126s 2025-12-04 02:43:01.807 5714 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 473
Timestamp: 2025-12-04T02:43:00.164598Z
Next consensus number: 10584
Legacy running event hash: 2ca6320de4d8298d0cac53cad6b689620709becce01f05ae3b910a7dc26f145ba247b1abddf1e073c8ad5a0262fa51d5
Legacy running event mnemonic: network-double-govern-coyote
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1540079312
Root hash: 5cb336799d5957b9cabe477bead7e88ae8c1bf8e87237758de0b54b2c19ab330a32b861645f1e49bc84038555d513dd9
(root) ConsistencyTestingToolState / nose-song-female-warm
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 market-real-coffee-baby
    1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon
    2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy
    3 StringLeaf -6437258573439123036 /3 lazy-survey-suffer-attitude
    4 StringLeaf 473 /4 school-access-hour-dinosaur
node1 5m 23.132s 2025-12-04 02:43:01.813 5715 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/12/04/2025-12-04T02+37+54.916827754Z_seq0_minr1_maxr501_orgn0.pces
node1 5m 23.133s 2025-12-04 02:43:01.814 5716 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 446
File: data/saved/preconsensus-events/1/2025/12/04/2025-12-04T02+37+54.916827754Z_seq0_minr1_maxr501_orgn0.pces
node1 5m 23.133s 2025-12-04 02:43:01.814 5717 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 5m 23.140s 2025-12-04 02:43:01.821 5718 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 5m 23.141s 2025-12-04 02:43:01.822 5719 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 473 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/473 {"round":473,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/473/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 5m 23.142s 2025-12-04 02:43:01.823 5720 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/6
node2 5m 23.193s 2025-12-04 02:43:01.874 5743 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 473
node2 5m 23.195s 2025-12-04 02:43:01.876 5744 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 473
Timestamp: 2025-12-04T02:43:00.164598Z
Next consensus number: 10584
Legacy running event hash: 2ca6320de4d8298d0cac53cad6b689620709becce01f05ae3b910a7dc26f145ba247b1abddf1e073c8ad5a0262fa51d5
Legacy running event mnemonic: network-double-govern-coyote
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1540079312
Root hash: 5cb336799d5957b9cabe477bead7e88ae8c1bf8e87237758de0b54b2c19ab330a32b861645f1e49bc84038555d513dd9
(root) ConsistencyTestingToolState / nose-song-female-warm
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 market-real-coffee-baby
    1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon
    2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy
    3 StringLeaf -6437258573439123036 /3 lazy-survey-suffer-attitude
    4 StringLeaf 473 /4 school-access-hour-dinosaur
node2 5m 23.202s 2025-12-04 02:43:01.883 5745 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/12/04/2025-12-04T02+37+55.110908429Z_seq0_minr1_maxr501_orgn0.pces
node2 5m 23.202s 2025-12-04 02:43:01.883 5746 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 446
File: data/saved/preconsensus-events/2/2025/12/04/2025-12-04T02+37+55.110908429Z_seq0_minr1_maxr501_orgn0.pces
node2 5m 23.202s 2025-12-04 02:43:01.883 5747 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 5m 23.210s 2025-12-04 02:43:01.891 5748 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 5m 23.210s 2025-12-04 02:43:01.891 5749 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 473 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/473 {"round":473,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/473/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 5m 23.211s 2025-12-04 02:43:01.892 5710 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 473
node2 5m 23.212s 2025-12-04 02:43:01.893 5750 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/6
node0 5m 23.213s 2025-12-04 02:43:01.894 5711 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 473
Timestamp: 2025-12-04T02:43:00.164598Z
Next consensus number: 10584
Legacy running event hash: 2ca6320de4d8298d0cac53cad6b689620709becce01f05ae3b910a7dc26f145ba247b1abddf1e073c8ad5a0262fa51d5
Legacy running event mnemonic: network-double-govern-coyote
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1540079312
Root hash: 5cb336799d5957b9cabe477bead7e88ae8c1bf8e87237758de0b54b2c19ab330a32b861645f1e49bc84038555d513dd9
(root) ConsistencyTestingToolState / nose-song-female-warm
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 market-real-coffee-baby
    1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon
    2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy
    3 StringLeaf -6437258573439123036 /3 lazy-survey-suffer-attitude
    4 StringLeaf 473 /4 school-access-hour-dinosaur
node0 5m 23.221s 2025-12-04 02:43:01.902 5712 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/12/04/2025-12-04T02+37+54.982574427Z_seq0_minr1_maxr501_orgn0.pces
node0 5m 23.222s 2025-12-04 02:43:01.903 5713 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 446
File: data/saved/preconsensus-events/0/2025/12/04/2025-12-04T02+37+54.982574427Z_seq0_minr1_maxr501_orgn0.pces
node0 5m 23.222s 2025-12-04 02:43:01.903 5714 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 5m 23.230s 2025-12-04 02:43:01.911 5715 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 5m 23.231s 2025-12-04 02:43:01.912 5716 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 473 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/473 {"round":473,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/473/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 5m 23.232s 2025-12-04 02:43:01.913 5717 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/6
node3 5m 23.339s 2025-12-04 02:43:02.020 5651 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 473 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/473
node3 5m 23.339s 2025-12-04 02:43:02.020 5653 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 473
node3 5m 23.436s 2025-12-04 02:43:02.117 5688 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 473
node3 5m 23.438s 2025-12-04 02:43:02.119 5689 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 473
Timestamp: 2025-12-04T02:43:00.164598Z
Next consensus number: 10584
Legacy running event hash: 2ca6320de4d8298d0cac53cad6b689620709becce01f05ae3b910a7dc26f145ba247b1abddf1e073c8ad5a0262fa51d5
Legacy running event mnemonic: network-double-govern-coyote
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1540079312
Root hash: 5cb336799d5957b9cabe477bead7e88ae8c1bf8e87237758de0b54b2c19ab330a32b861645f1e49bc84038555d513dd9
(root) ConsistencyTestingToolState / nose-song-female-warm
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 market-real-coffee-baby
    1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon
    2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy
    3 StringLeaf -6437258573439123036 /3 lazy-survey-suffer-attitude
    4 StringLeaf 473 /4 school-access-hour-dinosaur
node3 5m 23.444s 2025-12-04 02:43:02.125 5690 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/12/04/2025-12-04T02+37+55.067700683Z_seq0_minr1_maxr501_orgn0.pces
node3 5m 23.445s 2025-12-04 02:43:02.126 5691 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 446
File: data/saved/preconsensus-events/3/2025/12/04/2025-12-04T02+37+55.067700683Z_seq0_minr1_maxr501_orgn0.pces
node3 5m 23.445s 2025-12-04 02:43:02.126 5692 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 5m 23.453s 2025-12-04 02:43:02.134 5693 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 5m 23.453s 2025-12-04 02:43:02.134 5694 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 473 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/473 {"round":473,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/473/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 5m 23.455s 2025-12-04 02:43:02.136 5705 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/6
node4 5m 52.885s 2025-12-04 02:43:31.566 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node4 5m 52.969s 2025-12-04 02:43:31.650 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node4 5m 52.984s 2025-12-04 02:43:31.665 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 5m 53.091s 2025-12-04 02:43:31.772 4 INFO STARTUP <main> Browser: The following nodes [4] are set to run locally
node4 5m 53.096s 2025-12-04 02:43:31.777 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node4 5m 53.109s 2025-12-04 02:43:31.790 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 5m 53.518s 2025-12-04 02:43:32.199 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node4 5m 53.519s 2025-12-04 02:43:32.200 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node4 5m 54.290s 2025-12-04 02:43:32.971 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 771ms
node4 5m 54.299s 2025-12-04 02:43:32.980 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node4 5m 54.302s 2025-12-04 02:43:32.983 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 5m 54.339s 2025-12-04 02:43:33.020 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node4 5m 54.405s 2025-12-04 02:43:33.086 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node4 5m 54.406s 2025-12-04 02:43:33.087 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node4 5m 56.448s 2025-12-04 02:43:35.129 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node4 5m 56.526s 2025-12-04 02:43:35.207 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 56.532s 2025-12-04 02:43:35.213 21 INFO STARTUP <main> StartupStateUtils: The following saved states were found on disk:
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/190/SignedState.swh
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/99/SignedState.swh
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/6/SignedState.swh
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1/SignedState.swh
node4 5m 56.533s 2025-12-04 02:43:35.214 22 INFO STARTUP <main> StartupStateUtils: Loading latest state from disk.
node4 5m 56.533s 2025-12-04 02:43:35.214 23 INFO STARTUP <main> StartupStateUtils: Loading signed state from disk: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/190/SignedState.swh
node4 5m 56.537s 2025-12-04 02:43:35.218 24 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 5m 56.541s 2025-12-04 02:43:35.222 25 INFO STATE_TO_DISK <main> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp
node4 5m 56.666s 2025-12-04 02:43:35.347 36 INFO STARTUP <main> StartupStateUtils: Loaded state's hash is the same as when it was saved.
node4 5m 56.669s 2025-12-04 02:43:35.350 37 INFO STARTUP <main> StartupStateUtils: Platform has loaded a saved state {"round":190,"consensusTimestamp":"2025-12-04T02:40:00.048504Z"} [com.swirlds.logging.legacy.payload.SavedStateLoadedPayload]
node4 5m 56.672s 2025-12-04 02:43:35.353 40 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 56.679s 2025-12-04 02:43:35.360 43 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node4 5m 56.681s 2025-12-04 02:43:35.362 44 INFO STARTUP <main> AddressBookInitializer: Using the loaded state's address book and weight values.
node4 5m 56.688s 2025-12-04 02:43:35.369 45 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 56.689s 2025-12-04 02:43:35.370 46 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 57.721s 2025-12-04 02:43:36.402 47 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26777660]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=240180, randomLong=1754426785133214506, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=6680, randomLong=4132961142003526611, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1127400, data=35, exception=null]
OS Health Check Report - Complete (took 1019 ms)
node4 5m 57.748s 2025-12-04 02:43:36.429 48 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node4 5m 57.831s 2025-12-04 02:43:36.512 49 INFO STARTUP <main> PcesUtilities: Span compaction completed for data/saved/preconsensus-events/4/2025/12/04/2025-12-04T02+37+54.823990167Z_seq0_minr1_maxr501_orgn0.pces, new upper bound is 271
node4 5m 57.833s 2025-12-04 02:43:36.514 50 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node4 5m 57.836s 2025-12-04 02:43:36.517 51 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node4 5m 57.904s 2025-12-04 02:43:36.585 52 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "Ioh9hA==", "port": 30124 }, { "ipAddressV4": "CoAAPQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "IqtEug==", "port": 30125 }, { "ipAddressV4": "CoAAPg==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "iHF6nQ==", "port": 30126 }, { "ipAddressV4": "CoAAQA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "gtN3xw==", "port": 30127 }, { "ipAddressV4": "CoAAQQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "aMWy1A==", "port": 30128 }, { "ipAddressV4": "CoAAPw==", "port": 30128 }] }] }
node4 5m 57.922s 2025-12-04 02:43:36.603 53 INFO STARTUP <main> ConsistencyTestingToolState: State initialized with state long -9177701731925398757.
node4 5m 57.923s 2025-12-04 02:43:36.604 54 INFO STARTUP <main> ConsistencyTestingToolState: State initialized with 190 rounds handled.
node4 5m 57.923s 2025-12-04 02:43:36.604 55 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 5m 57.924s 2025-12-04 02:43:36.605 56 INFO STARTUP <main> TransactionHandlingHistory: Log file found. Parsing previous history
node4 5m 58.682s 2025-12-04 02:43:37.363 57 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 190 Timestamp: 2025-12-04T02:40:00.048504Z Next consensus number: 5101 Legacy running event hash: 15d31a81dc6c4a0b4f0a176a3b87284c688232e2d65ea7eeb8217683ebc23a680278efe85e2859006936397a9a266cee Legacy running event mnemonic: risk-expand-income-door Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -1109048935 Root hash: 30537c7ac026a72bf50d2a97774d6bf76957c87c878afba0f4e4691fc56422b84212662a4bd4a0239191e3b180ff1442 (root) ConsistencyTestingToolState / apology-monster-win-boy 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 fetch-jealous-near-eager 1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon 2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy 3 StringLeaf -9177701731925398757 /3 inherit-cream-edit-eye 4 StringLeaf 190 /4 derive-person-belt-ancient
node4 5m 58.937s 2025-12-04 02:43:37.618 59 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 15d31a81dc6c4a0b4f0a176a3b87284c688232e2d65ea7eeb8217683ebc23a680278efe85e2859006936397a9a266cee
node4 5m 58.950s 2025-12-04 02:43:37.631 60 INFO STARTUP <platformForkJoinThread-4> Shadowgraph: Shadowgraph starting from expiration threshold 163
node4 5m 58.958s 2025-12-04 02:43:37.639 62 INFO STARTUP <<start-node-4>> ConsistencyTestingToolMain: init called in Main for node 4.
node4 5m 58.959s 2025-12-04 02:43:37.640 63 INFO STARTUP <<start-node-4>> SwirldsPlatform: Starting platform 4
node4 5m 58.960s 2025-12-04 02:43:37.641 64 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node4 5m 58.963s 2025-12-04 02:43:37.644 65 INFO STARTUP <<start-node-4>> CycleFinder: No cyclical back pressure detected in wiring model.
node4 5m 58.965s 2025-12-04 02:43:37.646 66 INFO STARTUP <<start-node-4>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node4 5m 58.966s 2025-12-04 02:43:37.647 67 INFO STARTUP <<start-node-4>> InputWireChecks: All input wires have been bound.
node4 5m 58.968s 2025-12-04 02:43:37.649 68 INFO STARTUP <<start-node-4>> SwirldsPlatform: replaying preconsensus event stream starting at 163
node4 5m 58.974s 2025-12-04 02:43:37.655 69 INFO PLATFORM_STATUS <platformForkJoinThread-3> StatusStateMachine: Platform spent 191.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node4 5m 59.178s 2025-12-04 02:43:37.859 70 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:4 H:f9ddae5d9d80 BR:188), num remaining: 4
node4 5m 59.179s 2025-12-04 02:43:37.860 71 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:3 H:3f17dd412366 BR:188), num remaining: 3
node4 5m 59.180s 2025-12-04 02:43:37.861 72 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:1 H:2b64e8bd2faa BR:188), num remaining: 2
node4 5m 59.180s 2025-12-04 02:43:37.861 73 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:0 H:4664003a85b5 BR:188), num remaining: 1
node4 5m 59.181s 2025-12-04 02:43:37.862 74 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:2 H:992eb574848c BR:188), num remaining: 0
node4 5m 59.624s 2025-12-04 02:43:38.305 731 INFO STARTUP <<start-node-4>> PcesReplayer: Replayed 2,948 preconsensus events with max birth round 271. These events contained 7,595 transactions. 80 rounds reached consensus spanning 50.8 seconds of consensus time. The latest round to reach consensus is round 270. Replay took 655.0 milliseconds.
node4 5m 59.625s 2025-12-04 02:43:38.306 732 INFO STARTUP <<app: appMain 4>> ConsistencyTestingToolMain: run called in Main.
node4 5m 59.627s 2025-12-04 02:43:38.308 733 INFO PLATFORM_STATUS <platformForkJoinThread-6> StatusStateMachine: Platform spent 651.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node4 6.009m 2025-12-04 02:43:39.204 812 INFO PLATFORM_STATUS <platformForkJoinThread-7> StatusStateMachine: Platform spent 895.0 ms in OBSERVING. Now in BEHIND
node4 6.009m 2025-12-04 02:43:39.205 813 INFO RECONNECT <platformForkJoinThread-3> ReconnectController: Starting ReconnectController
node4 6.009m 2025-12-04 02:43:39.205 814 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectPlatformHelperImpl: Preparing for reconnect, stopping gossip
node4 6.009m 2025-12-04 02:43:39.206 815 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectPlatformHelperImpl: Preparing for reconnect, start clearing queues
node4 6.009m 2025-12-04 02:43:39.207 816 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectPlatformHelperImpl: Queues have been cleared
node4 6.009m 2025-12-04 02:43:39.208 817 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectController: waiting for reconnect connection
node4 6.009m 2025-12-04 02:43:39.208 818 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectController: acquired reconnect connection
node1 6.013m 2025-12-04 02:43:39.442 6435 INFO RECONNECT <<platform-core: SyncProtocolWith4 1 to 4>> ReconnectTeacher: Starting reconnect in the role of the sender {"receiving":false,"nodeId":1,"otherNodeId":4,"round":532} [com.swirlds.logging.legacy.payload.ReconnectStartPayload]
node1 6.013m 2025-12-04 02:43:39.444 6436 INFO RECONNECT <<platform-core: SyncProtocolWith4 1 to 4>> ReconnectTeacher: The following state will be sent to the learner:
Round: 532 Timestamp: 2025-12-04T02:43:37.244699Z Next consensus number: 11530 Legacy running event hash: 6c8fb63e3234e1c81678c7a22706aef284dfa8cf97f3da67aa76108b16bc4760a585d3ebb74830314da3b3d8812b77c8 Legacy running event mnemonic: music-debate-educate-shove Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1037676857 Root hash: 23e5ec11b9c06640533efcdd8532538181b42563ba4cdd4c05393f2523736db72d7b5bc7a654d11056bcbd4e8542f6c8 (root) ConsistencyTestingToolState / lake-law-siege-siege 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 around-kit-canoe-own 1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon 2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy 3 StringLeaf -5469284889939319145 /3 joke-merit-ramp-ring 4 StringLeaf 532 /4 snack-century-brass-raven
node1 6.013m 2025-12-04 02:43:39.444 6437 INFO RECONNECT <<platform-core: SyncProtocolWith4 1 to 4>> ReconnectTeacher: Sending signatures from nodes 0, 1, 2 (signing weight = 37500000000/50000000000) for state hash 23e5ec11b9c06640533efcdd8532538181b42563ba4cdd4c05393f2523736db72d7b5bc7a654d11056bcbd4e8542f6c8
node1 6.013m 2025-12-04 02:43:39.444 6438 INFO RECONNECT <<platform-core: SyncProtocolWith4 1 to 4>> ReconnectTeacher: Starting synchronization in the role of the sender.
node1 6.013m 2025-12-04 02:43:39.450 6439 INFO RECONNECT <<platform-core: SyncProtocolWith4 1 to 4>> TeachingSynchronizer: sending tree rooted at com.swirlds.demo.consistency.ConsistencyTestingToolState with route []
node1 6.013m 2025-12-04 02:43:39.459 6440 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@5527c199 start run()
node4 6.014m 2025-12-04 02:43:39.509 819 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectSyncHelper: Starting reconnect in role of the receiver. {"receiving":true,"nodeId":4,"otherNodeId":1,"round":270} [com.swirlds.logging.legacy.payload.ReconnectStartPayload]
node4 6.014m 2025-12-04 02:43:39.511 820 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectLearner: Receiving signed state signatures
node4 6.014m 2025-12-04 02:43:39.515 821 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectLearner: Received signatures from nodes 0, 1, 2
node4 6.014m 2025-12-04 02:43:39.517 822 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls receiveTree()
node4 6.014m 2025-12-04 02:43:39.518 823 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: synchronizing tree
node4 6.014m 2025-12-04 02:43:39.518 824 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: receiving tree rooted at com.swirlds.demo.consistency.ConsistencyTestingToolState with route []
node4 6.014m 2025-12-04 02:43:39.524 825 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@77188aec start run()
node4 6.014m 2025-12-04 02:43:39.531 826 INFO STARTUP <<work group learning-synchronizer: async-input-stream #0>> ConsistencyTestingToolState: New State Constructed.
node1 6.016m 2025-12-04 02:43:39.612 6456 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@5527c199 finish run()
node1 6.016m 2025-12-04 02:43:39.613 6457 INFO RECONNECT <<platform-core: SyncProtocolWith4 1 to 4>> TeachingSynchronizer: finished sending tree
node1 6.016m 2025-12-04 02:43:39.614 6458 INFO RECONNECT <<platform-core: SyncProtocolWith4 1 to 4>> TeachingSynchronizer: sending tree rooted at com.swirlds.virtualmap.VirtualMap with route [2]
node1 6.016m 2025-12-04 02:43:39.615 6459 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@61b2053d start run()
node4 6m 1.044s 2025-12-04 02:43:39.725 848 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread finished the learning loop for the current subtree
node4 6m 1.045s 2025-12-04 02:43:39.726 849 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread closed input, output, and view for the current subtree
node4 6m 1.045s 2025-12-04 02:43:39.726 850 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@77188aec finish run()
node4 6m 1.046s 2025-12-04 02:43:39.727 851 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: received tree rooted at com.swirlds.demo.consistency.ConsistencyTestingToolState with route []
node4 6m 1.046s 2025-12-04 02:43:39.727 852 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: receiving tree rooted at com.swirlds.virtualmap.VirtualMap with route [2]
node4 6m 1.049s 2025-12-04 02:43:39.730 853 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@514ff8db start run()
node4 6m 1.112s 2025-12-04 02:43:39.793 854 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> ReconnectNodeRemover: setPathInformation(): firstLeafPath: 1 -> 1, lastLeafPath: 1 -> 1
node4 6m 1.112s 2025-12-04 02:43:39.793 855 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> ReconnectNodeRemover: setPathInformation(): done
node4 6m 1.114s 2025-12-04 02:43:39.795 856 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread finished the learning loop for the current subtree
node4 6m 1.115s 2025-12-04 02:43:39.796 857 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: call nodeRemover.allNodesReceived()
node4 6m 1.115s 2025-12-04 02:43:39.796 858 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> ReconnectNodeRemover: allNodesReceived()
node4 6m 1.115s 2025-12-04 02:43:39.796 859 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> ReconnectNodeRemover: allNodesReceived(): done
node4 6m 1.115s 2025-12-04 02:43:39.796 860 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: call root.endLearnerReconnect()
node4 6m 1.116s 2025-12-04 02:43:39.797 861 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call reconnectIterator.close()
node4 6m 1.116s 2025-12-04 02:43:39.797 862 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call setHashPrivate()
node1 6m 1.185s 2025-12-04 02:43:39.866 6460 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@61b2053d finish run()
node1 6m 1.185s 2025-12-04 02:43:39.866 6461 INFO RECONNECT <<platform-core: SyncProtocolWith4 1 to 4>> TeachingSynchronizer: finished sending tree
node1 6m 1.188s 2025-12-04 02:43:39.869 6464 INFO RECONNECT <<platform-core: SyncProtocolWith4 1 to 4>> ReconnectTeacher: Finished synchronization in the role of the sender.
node4 6m 1.275s 2025-12-04 02:43:39.956 872 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call postInit()
node4 6m 1.275s 2025-12-04 02:43:39.956 874 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: endLearnerReconnect() complete
node4 6m 1.276s 2025-12-04 02:43:39.957 875 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: close() complete
node4 6m 1.276s 2025-12-04 02:43:39.957 876 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread closed input, output, and view for the current subtree
node4 6m 1.277s 2025-12-04 02:43:39.958 877 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@514ff8db finish run()
node4 6m 1.277s 2025-12-04 02:43:39.958 878 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: received tree rooted at com.swirlds.virtualmap.VirtualMap with route [2]
node4 6m 1.278s 2025-12-04 02:43:39.959 879 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: synchronization complete
node4 6m 1.278s 2025-12-04 02:43:39.959 880 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls initialize()
node4 6m 1.278s 2025-12-04 02:43:39.959 881 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: initializing tree
node4 6m 1.278s 2025-12-04 02:43:39.959 882 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: initialization complete
node4 6m 1.278s 2025-12-04 02:43:39.959 883 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls hash()
node4 6m 1.279s 2025-12-04 02:43:39.960 884 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: hashing tree
node4 6m 1.279s 2025-12-04 02:43:39.960 885 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: hashing complete
node4 6m 1.280s 2025-12-04 02:43:39.961 886 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls logStatistics()
node4 6m 1.283s 2025-12-04 02:43:39.964 887 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: Finished synchronization {"timeInSeconds":0.441,"hashTimeInSeconds":0.0,"initializationTimeInSeconds":0.0,"totalNodes":12,"leafNodes":7,"redundantLeafNodes":4,"internalNodes":5,"redundantInternalNodes":2} [com.swirlds.logging.legacy.payload.SynchronizationCompletePayload]
node4 6m 1.283s 2025-12-04 02:43:39.964 888 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: ReconnectMapMetrics: transfersFromTeacher=12; transfersFromLearner=10; internalHashes=0; internalCleanHashes=0; internalData=0; internalCleanData=0; leafHashes=3; leafCleanHashes=3; leafData=7; leafCleanData=4
node4 6m 1.284s 2025-12-04 02:43:39.965 889 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner is done synchronizing
node4 6m 1.287s 2025-12-04 02:43:39.968 890 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectLearner: Reconnect data usage report {"dataMegabytes":0.006054878234863281} [com.swirlds.logging.legacy.payload.ReconnectDataUsagePayload]
node4 6m 1.291s 2025-12-04 02:43:39.972 891 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectSyncHelper: Finished reconnect in the role of the receiver. {"receiving":true,"nodeId":4,"otherNodeId":1,"round":532,"success":false} [com.swirlds.logging.legacy.payload.ReconnectFinishPayload]
node4 6m 1.291s 2025-12-04 02:43:39.972 892 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectSyncHelper: Information for state received during reconnect:
Round: 532 Timestamp: 2025-12-04T02:43:37.244699Z Next consensus number: 11530 Legacy running event hash: 6c8fb63e3234e1c81678c7a22706aef284dfa8cf97f3da67aa76108b16bc4760a585d3ebb74830314da3b3d8812b77c8 Legacy running event mnemonic: music-debate-educate-shove Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1037676857 Root hash: 23e5ec11b9c06640533efcdd8532538181b42563ba4cdd4c05393f2523736db72d7b5bc7a654d11056bcbd4e8542f6c8 (root) ConsistencyTestingToolState / lake-law-siege-siege 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 around-kit-canoe-own 1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon 2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy 3 StringLeaf -5469284889939319145 /3 joke-merit-ramp-ring 4 StringLeaf 532 /4 snack-century-brass-raven
node4 6m 1.292s 2025-12-04 02:43:39.973 894 DEBUG RECONNECT <<reconnect: reconnect-controller>> ReconnectStateLoader: `loadReconnectState` : reloading state
node4 6m 1.292s 2025-12-04 02:43:39.973 895 INFO STARTUP <<reconnect: reconnect-controller>> ConsistencyTestingToolState: State initialized with state long -5469284889939319145.
node4 6m 1.293s 2025-12-04 02:43:39.974 896 INFO STARTUP <<reconnect: reconnect-controller>> ConsistencyTestingToolState: State initialized with 532 rounds handled.
node4 6m 1.293s 2025-12-04 02:43:39.974 897 INFO STARTUP <<reconnect: reconnect-controller>> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 6m 1.293s 2025-12-04 02:43:39.974 898 INFO STARTUP <<reconnect: reconnect-controller>> TransactionHandlingHistory: Log file found. Parsing previous history
node4 6m 1.322s 2025-12-04 02:43:40.003 903 INFO STATE_TO_DISK <<reconnect: reconnect-controller>> DefaultSavedStateController: Signed state from round 532 created, will eventually be written to disk, for reason: RECONNECT
node4 6m 1.322s 2025-12-04 02:43:40.003 904 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 798.0 ms in BEHIND. Now in RECONNECT_COMPLETE
node4 6m 1.324s 2025-12-04 02:43:40.005 907 INFO STARTUP <platformForkJoinThread-8> Shadowgraph: Shadowgraph starting from expiration threshold 505
node4 6m 1.324s 2025-12-04 02:43:40.005 908 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 532 state to disk. Reason: RECONNECT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/532
node4 6m 1.326s 2025-12-04 02:43:40.007 909 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/3 for round 532
node4 6m 1.336s 2025-12-04 02:43:40.017 921 INFO EVENT_STREAM <<reconnect: reconnect-controller>> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 6c8fb63e3234e1c81678c7a22706aef284dfa8cf97f3da67aa76108b16bc4760a585d3ebb74830314da3b3d8812b77c8
node4 6m 1.337s 2025-12-04 02:43:40.018 922 INFO STARTUP <platformForkJoinThread-1> PcesFileManager: Due to recent operations on this node, the local preconsensus event stream will have a discontinuity. The last file with the old origin round is 2025-12-04T02+37+54.823990167Z_seq0_minr1_maxr271_orgn0.pces. All future files will have an origin round of 532.
node1 6m 1.360s 2025-12-04 02:43:40.041 6468 INFO RECONNECT <<platform-core: SyncProtocolWith4 1 to 4>> ReconnectTeacher: Finished reconnect in the role of the sender. {"receiving":false,"nodeId":1,"otherNodeId":4,"round":532,"success":false} [com.swirlds.logging.legacy.payload.ReconnectFinishPayload]
node4 6m 1.469s 2025-12-04 02:43:40.150 946 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/3 for round 532
node4 6m 1.473s 2025-12-04 02:43:40.154 947 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 532 Timestamp: 2025-12-04T02:43:37.244699Z Next consensus number: 11530 Legacy running event hash: 6c8fb63e3234e1c81678c7a22706aef284dfa8cf97f3da67aa76108b16bc4760a585d3ebb74830314da3b3d8812b77c8 Legacy running event mnemonic: music-debate-educate-shove Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1037676857 Root hash: 23e5ec11b9c06640533efcdd8532538181b42563ba4cdd4c05393f2523736db72d7b5bc7a654d11056bcbd4e8542f6c8 (root) ConsistencyTestingToolState / lake-law-siege-siege 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 around-kit-canoe-own 1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon 2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy 3 StringLeaf -5469284889939319145 /3 joke-merit-ramp-ring 4 StringLeaf 532 /4 snack-century-brass-raven
node4 6m 1.521s 2025-12-04 02:43:40.202 948 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/12/04/2025-12-04T02+37+54.823990167Z_seq0_minr1_maxr271_orgn0.pces
node4 6m 1.521s 2025-12-04 02:43:40.202 949 WARN STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: No preconsensus event files meeting specified criteria found to copy. Lower bound: 505
node4 6m 1.526s 2025-12-04 02:43:40.207 950 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 532 to disk. Reason: RECONNECT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/532 {"round":532,"freezeState":false,"reason":"RECONNECT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/532/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 6m 1.530s 2025-12-04 02:43:40.211 951 INFO PLATFORM_STATUS <platformForkJoinThread-3> StatusStateMachine: Platform spent 206.0 ms in RECONNECT_COMPLETE. Now in CHECKING
node4 6m 1.969s 2025-12-04 02:43:40.650 952 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting4.csv' ]
node4 6m 1.972s 2025-12-04 02:43:40.653 953 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node4 6m 2.421s 2025-12-04 02:43:41.102 954 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:0 H:799d31a8ea9b BR:530), num remaining: 3
node4 6m 2.425s 2025-12-04 02:43:41.106 955 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:1 H:b7f055123acf BR:530), num remaining: 2
node4 6m 2.426s 2025-12-04 02:43:41.107 956 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:3 H:7fd90ad54e4b BR:531), num remaining: 1
node4 6m 2.426s 2025-12-04 02:43:41.107 957 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:2 H:6d51e34813e3 BR:531), num remaining: 0
node4 6m 7.460s 2025-12-04 02:43:46.141 1063 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 5.9 s in CHECKING. Now in ACTIVE
node2 6m 22.980s 2025-12-04 02:44:01.661 6798 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 566 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 6m 23.065s 2025-12-04 02:44:01.746 6774 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 566 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 6m 23.111s 2025-12-04 02:44:01.792 6760 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 566 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 6m 23.165s 2025-12-04 02:44:01.846 6811 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 566 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 6m 23.234s 2025-12-04 02:44:01.915 1310 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 566 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 6m 23.440s 2025-12-04 02:44:02.121 6814 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 566 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/566
node1 6m 23.441s 2025-12-04 02:44:02.122 6815 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/52 for round 566
node0 6m 23.450s 2025-12-04 02:44:02.131 6777 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 566 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/566
node0 6m 23.450s 2025-12-04 02:44:02.131 6778 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for round 566
node2 6m 23.492s 2025-12-04 02:44:02.173 6801 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 566 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/566
node2 6m 23.493s 2025-12-04 02:44:02.174 6802 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for round 566
node3 6m 23.520s 2025-12-04 02:44:02.201 6763 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 566 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/566
node3 6m 23.521s 2025-12-04 02:44:02.202 6764 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for round 566
node1 6m 23.526s 2025-12-04 02:44:02.207 6848 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/52 for round 566
node1 6m 23.528s 2025-12-04 02:44:02.209 6849 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 566 Timestamp: 2025-12-04T02:44:00.298423553Z Next consensus number: 12206 Legacy running event hash: 98bb76fa4a1897e50d1b6cb10a7f3768a25859c9785cb7ed2699e7c9468ad851dc45ee34da3901ec4d7058e9fbc718f3 Legacy running event mnemonic: trim-tent-drink-address Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -232164128 Root hash: eabf6e014a9132b1e6bf3ff42befdd1cd6bf569588777818968d46b5f62e6186b41b1244d1fad14a21b213dcb07f4763 (root) ConsistencyTestingToolState / popular-scan-fall-file 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 mammal-weekend-expand-bag 1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon 2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy 3 StringLeaf -7633485995492869428 /3 rail-cram-video-nut 4 StringLeaf 566 /4 false-business-seven-strategy
node1 6m 23.535s 2025-12-04 02:44:02.216 6850 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/12/04/2025-12-04T02+43+18.653639188Z_seq1_minr474_maxr5474_orgn0.pces Last file: data/saved/preconsensus-events/1/2025/12/04/2025-12-04T02+37+54.916827754Z_seq0_minr1_maxr501_orgn0.pces
node1 6m 23.535s 2025-12-04 02:44:02.216 6851 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 539 File: data/saved/preconsensus-events/1/2025/12/04/2025-12-04T02+43+18.653639188Z_seq1_minr474_maxr5474_orgn0.pces
node1 6m 23.536s 2025-12-04 02:44:02.217 6852 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 6m 23.537s 2025-12-04 02:44:02.218 6853 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 6m 23.537s 2025-12-04 02:44:02.218 6854 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 566 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/566 {"round":566,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/566/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 6m 23.539s 2025-12-04 02:44:02.220 6855 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/99
node0 6m 23.548s 2025-12-04 02:44:02.229 6825 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for round 566
node0 6m 23.550s 2025-12-04 02:44:02.231 6826 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 566 Timestamp: 2025-12-04T02:44:00.298423553Z Next consensus number: 12206 Legacy running event hash: 98bb76fa4a1897e50d1b6cb10a7f3768a25859c9785cb7ed2699e7c9468ad851dc45ee34da3901ec4d7058e9fbc718f3 Legacy running event mnemonic: trim-tent-drink-address Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -232164128 Root hash: eabf6e014a9132b1e6bf3ff42befdd1cd6bf569588777818968d46b5f62e6186b41b1244d1fad14a21b213dcb07f4763 (root) ConsistencyTestingToolState / popular-scan-fall-file 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 mammal-weekend-expand-bag 1 SingletonNode RosterService.ROSTER_STATE /1 hard-seed-oil-lemon 2 VirtualMap RosterService.ROSTERS /2 length-paper-where-enjoy 3 StringLeaf -7633485995492869428 /3 rail-cram-video-nut 4 StringLeaf 566 /4 false-business-seven-strategy
node0 6m 23.558s 2025-12-04 02:44:02.239 6827 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/12/04/2025-12-04T02+37+54.982574427Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/0/2025/12/04/2025-12-04T02+43+18.713205937Z_seq1_minr474_maxr5474_orgn0.pces
node0 6m 23.560s 2025-12-04 02:44:02.241 6828 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 539
File: data/saved/preconsensus-events/0/2025/12/04/2025-12-04T02+43+18.713205937Z_seq1_minr474_maxr5474_orgn0.pces
node0 6m 23.560s 2025-12-04 02:44:02.241 6829 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 6m 23.562s 2025-12-04 02:44:02.243 6830 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 6m 23.562s 2025-12-04 02:44:02.243 6831 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 566 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/566 {"round":566,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/566/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 6m 23.562s 2025-12-04 02:44:02.243 1313 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 566 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/566
node0 6m 23.563s 2025-12-04 02:44:02.244 6832 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/99
node4 6m 23.563s 2025-12-04 02:44:02.244 1314 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/10 for round 566
node2 6m 23.577s 2025-12-04 02:44:02.258 6849 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for round 566
node2 6m 23.579s 2025-12-04 02:44:02.260 6850 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 566
Timestamp: 2025-12-04T02:44:00.298423553Z
Next consensus number: 12206
Legacy running event hash: 98bb76fa4a1897e50d1b6cb10a7f3768a25859c9785cb7ed2699e7c9468ad851dc45ee34da3901ec4d7058e9fbc718f3
Legacy running event mnemonic: trim-tent-drink-address
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -232164128
Root hash: eabf6e014a9132b1e6bf3ff42befdd1cd6bf569588777818968d46b5f62e6186b41b1244d1fad14a21b213dcb07f4763
(root) ConsistencyTestingToolState  /  popular-scan-fall-file
  0 SingletonNode PlatformStateService.PLATFORM_STATE  /0  mammal-weekend-expand-bag
  1 SingletonNode RosterService.ROSTER_STATE  /1  hard-seed-oil-lemon
  2 VirtualMap RosterService.ROSTERS  /2  length-paper-where-enjoy
  3 StringLeaf -7633485995492869428  /3  rail-cram-video-nut
  4 StringLeaf 566  /4  false-business-seven-strategy
node2 6m 23.586s 2025-12-04 02:44:02.267 6851 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/12/04/2025-12-04T02+37+55.110908429Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/2/2025/12/04/2025-12-04T02+43+18.698508303Z_seq1_minr474_maxr5474_orgn0.pces
node2 6m 23.588s 2025-12-04 02:44:02.269 6852 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 539
File: data/saved/preconsensus-events/2/2025/12/04/2025-12-04T02+43+18.698508303Z_seq1_minr474_maxr5474_orgn0.pces
node2 6m 23.588s 2025-12-04 02:44:02.269 6853 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 6m 23.589s 2025-12-04 02:44:02.270 6854 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 6m 23.590s 2025-12-04 02:44:02.271 6855 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 566 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/566 {"round":566,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/566/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 6m 23.591s 2025-12-04 02:44:02.272 6856 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/99
node3 6m 23.616s 2025-12-04 02:44:02.297 6800 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for round 566
node3 6m 23.618s 2025-12-04 02:44:02.299 6801 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 566
Timestamp: 2025-12-04T02:44:00.298423553Z
Next consensus number: 12206
Legacy running event hash: 98bb76fa4a1897e50d1b6cb10a7f3768a25859c9785cb7ed2699e7c9468ad851dc45ee34da3901ec4d7058e9fbc718f3
Legacy running event mnemonic: trim-tent-drink-address
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -232164128
Root hash: eabf6e014a9132b1e6bf3ff42befdd1cd6bf569588777818968d46b5f62e6186b41b1244d1fad14a21b213dcb07f4763
(root) ConsistencyTestingToolState  /  popular-scan-fall-file
  0 SingletonNode PlatformStateService.PLATFORM_STATE  /0  mammal-weekend-expand-bag
  1 SingletonNode RosterService.ROSTER_STATE  /1  hard-seed-oil-lemon
  2 VirtualMap RosterService.ROSTERS  /2  length-paper-where-enjoy
  3 StringLeaf -7633485995492869428  /3  rail-cram-video-nut
  4 StringLeaf 566  /4  false-business-seven-strategy
node3 6m 23.625s 2025-12-04 02:44:02.306 6802 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/12/04/2025-12-04T02+43+18.935508925Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/3/2025/12/04/2025-12-04T02+37+55.067700683Z_seq0_minr1_maxr501_orgn0.pces
node3 6m 23.628s 2025-12-04 02:44:02.309 6803 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 539
File: data/saved/preconsensus-events/3/2025/12/04/2025-12-04T02+43+18.935508925Z_seq1_minr474_maxr5474_orgn0.pces
node3 6m 23.628s 2025-12-04 02:44:02.309 6804 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 6m 23.630s 2025-12-04 02:44:02.311 6805 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 6m 23.630s 2025-12-04 02:44:02.311 6806 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 566 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/566 {"round":566,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/566/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 6m 23.632s 2025-12-04 02:44:02.313 6807 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/99
node4 6m 23.694s 2025-12-04 02:44:02.375 1358 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/10 for round 566
node4 6m 23.696s 2025-12-04 02:44:02.377 1359 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 566
Timestamp: 2025-12-04T02:44:00.298423553Z
Next consensus number: 12206
Legacy running event hash: 98bb76fa4a1897e50d1b6cb10a7f3768a25859c9785cb7ed2699e7c9468ad851dc45ee34da3901ec4d7058e9fbc718f3
Legacy running event mnemonic: trim-tent-drink-address
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -232164128
Root hash: eabf6e014a9132b1e6bf3ff42befdd1cd6bf569588777818968d46b5f62e6186b41b1244d1fad14a21b213dcb07f4763
(root) ConsistencyTestingToolState  /  popular-scan-fall-file
  0 SingletonNode PlatformStateService.PLATFORM_STATE  /0  mammal-weekend-expand-bag
  1 SingletonNode RosterService.ROSTER_STATE  /1  hard-seed-oil-lemon
  2 VirtualMap RosterService.ROSTERS  /2  length-paper-where-enjoy
  3 StringLeaf -7633485995492869428  /3  rail-cram-video-nut
  4 StringLeaf 566  /4  false-business-seven-strategy
node4 6m 23.704s 2025-12-04 02:44:02.385 1360 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/4/2025/12/04/2025-12-04T02+43+40.539913941Z_seq1_minr505_maxr1005_orgn532.pces
Last file: data/saved/preconsensus-events/4/2025/12/04/2025-12-04T02+37+54.823990167Z_seq0_minr1_maxr271_orgn0.pces
node4 6m 23.704s 2025-12-04 02:44:02.385 1361 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 539
File: data/saved/preconsensus-events/4/2025/12/04/2025-12-04T02+43+40.539913941Z_seq1_minr505_maxr1005_orgn532.pces
node4 6m 23.704s 2025-12-04 02:44:02.385 1362 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 6m 23.706s 2025-12-04 02:44:02.387 1363 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 6m 23.707s 2025-12-04 02:44:02.388 1364 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 566 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/566 {"round":566,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/566/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 6m 23.709s 2025-12-04 02:44:02.390 1365 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1
node3 7m 22.730s 2025-12-04 02:45:01.411 7908 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 667 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 7m 22.795s 2025-12-04 02:45:01.476 7969 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 667 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 7m 22.800s 2025-12-04 02:45:01.481 7954 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 667 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 7m 22.996s 2025-12-04 02:45:01.677 2459 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 667 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 7m 23.143s 2025-12-04 02:45:01.824 7944 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 667 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 7m 23.275s 2025-12-04 02:45:01.956 7947 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 667 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/667
node0 7m 23.276s 2025-12-04 02:45:01.957 7948 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/57 for round 667
node3 7m 23.279s 2025-12-04 02:45:01.960 7921 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 667 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/667
node3 7m 23.280s 2025-12-04 02:45:01.961 7922 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/57 for round 667
node1 7m 23.350s 2025-12-04 02:45:02.031 7972 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 667 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/667
node1 7m 23.350s 2025-12-04 02:45:02.031 7973 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/59 for round 667
node3 7m 23.370s 2025-12-04 02:45:02.051 7953 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/57 for round 667
node3 7m 23.372s 2025-12-04 02:45:02.053 7954 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 667
Timestamp: 2025-12-04T02:45:00.208907183Z
Next consensus number: 14602
Legacy running event hash: 596216e2122e0ee0008c8add3063abb5358204e01164a7f3ca9a82beadc2279effbbb15a58e6f6d224fa39084b288431
Legacy running event mnemonic: clump-sword-easy-swing
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 831645418
Root hash: a77affe470b18fbe26cdea720986af8dbd793f3c5274c24c2a90e816fc1c227d02ff331e6c8b4b654d21891098caf41c
(root) ConsistencyTestingToolState  /  awkward-adjust-bike-analyst
  0 SingletonNode PlatformStateService.PLATFORM_STATE  /0  attitude-frequent-similar-shadow
  1 SingletonNode RosterService.ROSTER_STATE  /1  hard-seed-oil-lemon
  2 VirtualMap RosterService.ROSTERS  /2  length-paper-where-enjoy
  3 StringLeaf -8293533747159525616  /3  leave-connect-pistol-whale
  4 StringLeaf 667  /4  glow-little-about-protect
node0 7m 23.375s 2025-12-04 02:45:02.056 7983 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/57 for round 667
node0 7m 23.377s 2025-12-04 02:45:02.058 7984 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 667
Timestamp: 2025-12-04T02:45:00.208907183Z
Next consensus number: 14602
Legacy running event hash: 596216e2122e0ee0008c8add3063abb5358204e01164a7f3ca9a82beadc2279effbbb15a58e6f6d224fa39084b288431
Legacy running event mnemonic: clump-sword-easy-swing
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 831645418
Root hash: a77affe470b18fbe26cdea720986af8dbd793f3c5274c24c2a90e816fc1c227d02ff331e6c8b4b654d21891098caf41c
(root) ConsistencyTestingToolState  /  awkward-adjust-bike-analyst
  0 SingletonNode PlatformStateService.PLATFORM_STATE  /0  attitude-frequent-similar-shadow
  1 SingletonNode RosterService.ROSTER_STATE  /1  hard-seed-oil-lemon
  2 VirtualMap RosterService.ROSTERS  /2  length-paper-where-enjoy
  3 StringLeaf -8293533747159525616  /3  leave-connect-pistol-whale
  4 StringLeaf 667  /4  glow-little-about-protect
node3 7m 23.380s 2025-12-04 02:45:02.061 7955 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/12/04/2025-12-04T02+43+18.935508925Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/3/2025/12/04/2025-12-04T02+37+55.067700683Z_seq0_minr1_maxr501_orgn0.pces
node3 7m 23.380s 2025-12-04 02:45:02.061 7956 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 639
File: data/saved/preconsensus-events/3/2025/12/04/2025-12-04T02+43+18.935508925Z_seq1_minr474_maxr5474_orgn0.pces
node3 7m 23.380s 2025-12-04 02:45:02.061 7957 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 7m 23.384s 2025-12-04 02:45:02.065 7958 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 7m 23.385s 2025-12-04 02:45:02.066 7985 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/12/04/2025-12-04T02+37+54.982574427Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/0/2025/12/04/2025-12-04T02+43+18.713205937Z_seq1_minr474_maxr5474_orgn0.pces
node0 7m 23.385s 2025-12-04 02:45:02.066 7986 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 639
File: data/saved/preconsensus-events/0/2025/12/04/2025-12-04T02+43+18.713205937Z_seq1_minr474_maxr5474_orgn0.pces
node3 7m 23.385s 2025-12-04 02:45:02.066 7959 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 667 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/667 {"round":667,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/667/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 7m 23.386s 2025-12-04 02:45:02.067 7987 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 7m 23.387s 2025-12-04 02:45:02.068 7960 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/190
node0 7m 23.389s 2025-12-04 02:45:02.070 7988 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 7m 23.389s 2025-12-04 02:45:02.070 7989 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 667 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/667 {"round":667,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/667/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 7m 23.391s 2025-12-04 02:45:02.072 7990 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/190
node4 7m 23.398s 2025-12-04 02:45:02.079 2462 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 667 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/667
node4 7m 23.399s 2025-12-04 02:45:02.080 2463 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/17 for round 667
node1 7m 23.442s 2025-12-04 02:45:02.123 8012 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/59 for round 667
node1 7m 23.444s 2025-12-04 02:45:02.125 8013 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 667
Timestamp: 2025-12-04T02:45:00.208907183Z
Next consensus number: 14602
Legacy running event hash: 596216e2122e0ee0008c8add3063abb5358204e01164a7f3ca9a82beadc2279effbbb15a58e6f6d224fa39084b288431
Legacy running event mnemonic: clump-sword-easy-swing
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 831645418
Root hash: a77affe470b18fbe26cdea720986af8dbd793f3c5274c24c2a90e816fc1c227d02ff331e6c8b4b654d21891098caf41c
(root) ConsistencyTestingToolState  /  awkward-adjust-bike-analyst
  0 SingletonNode PlatformStateService.PLATFORM_STATE  /0  attitude-frequent-similar-shadow
  1 SingletonNode RosterService.ROSTER_STATE  /1  hard-seed-oil-lemon
  2 VirtualMap RosterService.ROSTERS  /2  length-paper-where-enjoy
  3 StringLeaf -8293533747159525616  /3  leave-connect-pistol-whale
  4 StringLeaf 667  /4  glow-little-about-protect
node1 7m 23.450s 2025-12-04 02:45:02.131 8014 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/12/04/2025-12-04T02+43+18.653639188Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/1/2025/12/04/2025-12-04T02+37+54.916827754Z_seq0_minr1_maxr501_orgn0.pces
node1 7m 23.451s 2025-12-04 02:45:02.132 8015 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 639
File: data/saved/preconsensus-events/1/2025/12/04/2025-12-04T02+43+18.653639188Z_seq1_minr474_maxr5474_orgn0.pces
node1 7m 23.451s 2025-12-04 02:45:02.132 8016 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 7m 23.454s 2025-12-04 02:45:02.135 8017 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 7m 23.454s 2025-12-04 02:45:02.135 8018 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 667 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/667 {"round":667,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/667/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 7m 23.456s 2025-12-04 02:45:02.137 8019 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/190
node2 7m 23.478s 2025-12-04 02:45:02.159 7957 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 667 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/667
node2 7m 23.478s 2025-12-04 02:45:02.159 7958 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/57 for round 667
node4 7m 23.527s 2025-12-04 02:45:02.208 2497 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/17 for round 667
node4 7m 23.529s 2025-12-04 02:45:02.210 2498 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 667
Timestamp: 2025-12-04T02:45:00.208907183Z
Next consensus number: 14602
Legacy running event hash: 596216e2122e0ee0008c8add3063abb5358204e01164a7f3ca9a82beadc2279effbbb15a58e6f6d224fa39084b288431
Legacy running event mnemonic: clump-sword-easy-swing
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 831645418
Root hash: a77affe470b18fbe26cdea720986af8dbd793f3c5274c24c2a90e816fc1c227d02ff331e6c8b4b654d21891098caf41c
(root) ConsistencyTestingToolState  /  awkward-adjust-bike-analyst
  0 SingletonNode PlatformStateService.PLATFORM_STATE  /0  attitude-frequent-similar-shadow
  1 SingletonNode RosterService.ROSTER_STATE  /1  hard-seed-oil-lemon
  2 VirtualMap RosterService.ROSTERS  /2  length-paper-where-enjoy
  3 StringLeaf -8293533747159525616  /3  leave-connect-pistol-whale
  4 StringLeaf 667  /4  glow-little-about-protect
node4 7m 23.537s 2025-12-04 02:45:02.218 2502 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/4/2025/12/04/2025-12-04T02+43+40.539913941Z_seq1_minr505_maxr1005_orgn532.pces
Last file: data/saved/preconsensus-events/4/2025/12/04/2025-12-04T02+37+54.823990167Z_seq0_minr1_maxr271_orgn0.pces
node4 7m 23.537s 2025-12-04 02:45:02.218 2503 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 639
File: data/saved/preconsensus-events/4/2025/12/04/2025-12-04T02+43+40.539913941Z_seq1_minr505_maxr1005_orgn532.pces
node4 7m 23.537s 2025-12-04 02:45:02.218 2504 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 7m 23.540s 2025-12-04 02:45:02.221 2505 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 7m 23.540s 2025-12-04 02:45:02.221 2506 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 667 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/667 {"round":667,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/667/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 7m 23.541s 2025-12-04 02:45:02.222 2507 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/6
node2 7m 23.570s 2025-12-04 02:45:02.251 8000 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/57 for round 667
node2 7m 23.572s 2025-12-04 02:45:02.253 8001 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 667
Timestamp: 2025-12-04T02:45:00.208907183Z
Next consensus number: 14602
Legacy running event hash: 596216e2122e0ee0008c8add3063abb5358204e01164a7f3ca9a82beadc2279effbbb15a58e6f6d224fa39084b288431
Legacy running event mnemonic: clump-sword-easy-swing
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 831645418
Root hash: a77affe470b18fbe26cdea720986af8dbd793f3c5274c24c2a90e816fc1c227d02ff331e6c8b4b654d21891098caf41c
(root) ConsistencyTestingToolState  /  awkward-adjust-bike-analyst
  0 SingletonNode PlatformStateService.PLATFORM_STATE  /0  attitude-frequent-similar-shadow
  1 SingletonNode RosterService.ROSTER_STATE  /1  hard-seed-oil-lemon
  2 VirtualMap RosterService.ROSTERS  /2  length-paper-where-enjoy
  3 StringLeaf -8293533747159525616  /3  leave-connect-pistol-whale
  4 StringLeaf 667  /4  glow-little-about-protect
node2 7m 23.578s 2025-12-04 02:45:02.259 8002 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/12/04/2025-12-04T02+37+55.110908429Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/2/2025/12/04/2025-12-04T02+43+18.698508303Z_seq1_minr474_maxr5474_orgn0.pces
node2 7m 23.578s 2025-12-04 02:45:02.259 8003 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 639
File: data/saved/preconsensus-events/2/2025/12/04/2025-12-04T02+43+18.698508303Z_seq1_minr474_maxr5474_orgn0.pces
node2 7m 23.579s 2025-12-04 02:45:02.260 8004 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 7m 23.581s 2025-12-04 02:45:02.262 8005 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 7m 23.582s 2025-12-04 02:45:02.263 8006 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 667 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/667 {"round":667,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/667/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 7m 23.583s 2025-12-04 02:45:02.264 8007 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/190
node1 7m 57.401s 2025-12-04 02:45:36.082 8622 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 1 to 4>> NetworkUtils: Connection broken: 1 -> 4
java.net.SocketException: Connection reset
	at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
	at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
	at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
	at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
	at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
	at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
	at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
	at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
	at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
	at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
	at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
	at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
	at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
	at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
	at java.base/java.io.FilterInputStream.read(FilterInputStream.java:71)
	at org.hiero.base.io.streams.AugmentedDataInputStream.read(AugmentedDataInputStream.java:57)
	at com.swirlds.platform.network.communication.states.WaitForAcceptReject.transition(WaitForAcceptReject.java:48)
	at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
	at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
	at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
	at java.base/java.lang.Thread.run(Thread.java:1583)
node2 7m 57.401s 2025-12-04 02:45:36.082 8603 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 2 to 4>> NetworkUtils: Connection broken: 2 -> 4
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-12-04T02:45:36.082610249Z
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
	at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
	at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
	at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
	at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
	at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-12-04T02:45:36.082610249Z
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:148)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
	... 7 more
Caused by: java.net.SocketException: Connection reset
	at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
	at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
	at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
	at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
	at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
	at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
	at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
	at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
	at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
	at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
	at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
	at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
	at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
	at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
	at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
	at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
	at java.base/java.io.DataInputStream.readLong(DataInputStream.java:407)
	at org.hiero.base.io.streams.AugmentedDataInputStream.readLong(AugmentedDataInputStream.java:186)
	at com.swirlds.platform.gossip.shadowgraph.SyncUtils.deserializeEventWindow(SyncUtils.java:640)
	at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readTheirTipsAndEventWindow$3(SyncUtils.java:104)
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
	... 11 more
node0 7m 57.402s 2025-12-04 02:45:36.083 8585 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 0 to 4>> NetworkUtils: Connection broken: 0 -> 4
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-12-04T02:45:36.082685734Z
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
	at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
	at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
	at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
	at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
	at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-12-04T02:45:36.082685734Z
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.sendAndReceiveEvents(ShadowgraphSynchronizer.java:241)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:201)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
	... 7 more
Caused by: java.net.SocketException: Connection reset
	at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
	at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
	at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
	at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
	at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
	at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
	at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
	at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
	at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
	at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
	at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
	at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
	at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
	at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
	at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
	at java.base/java.io.DataInputStream.readByte(DataInputStream.java:275)
	at org.hiero.base.io.streams.AugmentedDataInputStream.readByte(AugmentedDataInputStream.java:144)
	at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readEventsINeed$9(SyncUtils.java:278)
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
	... 12 more
node2 7m 57.581s 2025-12-04 02:45:36.262 8604 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith3 2 to 3>> NetworkUtils: Connection broken: 2 -> 3
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-12-04T02:45:36.262178326Z
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
	at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
	at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
	at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
	at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
	at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-12-04T02:45:36.262178326Z
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.sendAndReceiveEvents(ShadowgraphSynchronizer.java:241)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:201)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
	... 7 more
	Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
		at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
		at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
		at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
		... 12 more
	Caused by: java.net.SocketException: Connection or outbound has closed
		at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
		at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
		at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
		at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
		at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
		at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
		at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
		at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$sendEventsTheyNeed$8(SyncUtils.java:234)
		at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
		at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
		at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
		at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
		at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
		... 2 more
Caused by: java.net.SocketException: Connection reset
	at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
	at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
	at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
	at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
	at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
	at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
	at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
	at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
	at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
	at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
	at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
	at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
	at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
	at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
	at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
	at java.base/java.io.DataInputStream.readByte(DataInputStream.java:275)
	at org.hiero.base.io.streams.AugmentedDataInputStream.readByte(AugmentedDataInputStream.java:144)
	at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readEventsINeed$9(SyncUtils.java:278)
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
	... 12 more
node0 7m 57.582s 2025-12-04 02:45:36.263 8586 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith3 0 to 3>> NetworkUtils: Connection broken: 0 -> 3
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-12-04T02:45:36.262605624Z
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
	at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
	at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
	at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
	at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
	at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-12-04T02:45:36.262605624Z
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:148)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
	... 7 more
Caused by: java.net.SocketException: Connection reset
	at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
	at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
	at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
	at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
	at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
	at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
	at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
	at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
	at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
	at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
	at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
	at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
	at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
	at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
	at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
	at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
	at java.base/java.io.DataInputStream.readLong(DataInputStream.java:407)
	at org.hiero.base.io.streams.AugmentedDataInputStream.readLong(AugmentedDataInputStream.java:186)
	at com.swirlds.platform.gossip.shadowgraph.SyncUtils.deserializeEventWindow(SyncUtils.java:640)
	at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readTheirTipsAndEventWindow$3(SyncUtils.java:104)
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
	... 11 more
node1 7m 57.582s 2025-12-04 02:45:36.263 8623 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith3 1 to 3>> NetworkUtils: Connection broken: 1 -> 3
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-12-04T02:45:36.262419395Z
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
	at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
	at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
	at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
	at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
	at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-12-04T02:45:36.262419395Z
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:180)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
	... 7 more
Caused by: java.net.SocketException: Connection reset
	at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
	at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
	at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
	at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
	at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
	at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
	at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
	at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
	at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
	at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
	at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
	at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
	at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
	at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
	at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
	at java.base/java.io.DataInputStream.readBoolean(DataInputStream.java:255)
	at org.hiero.base.io.streams.AugmentedDataInputStream.readBoolean(AugmentedDataInputStream.java:137)
	at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readMyTipsTheyHave$7(SyncUtils.java:163)
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
	... 11 more