| node0 | 0.000ns | 2025-09-23 05:45:07.982 | 1 | INFO | STARTUP | <main> | StaticPlatformBuilder: | ||
| ////////////////////// // Node is Starting // ////////////////////// | |||||||||
| node0 | 86.000ms | 2025-09-23 05:45:08.068 | 2 | DEBUG | STARTUP | <main> | StaticPlatformBuilder: | main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload] | |
| node0 | 102.000ms | 2025-09-23 05:45:08.084 | 3 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node0 | 215.000ms | 2025-09-23 05:45:08.197 | 4 | INFO | STARTUP | <main> | Browser: | The following nodes [0] are set to run locally | |
| node0 | 222.000ms | 2025-09-23 05:45:08.204 | 5 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | Registering ConsistencyTestingToolState with ConstructableRegistry | |
| node0 | 234.000ms | 2025-09-23 05:45:08.216 | 6 | INFO | STARTUP | <main> | ConsistencyTestingToolState: | New State Constructed. | |
| node1 | 543.000ms | 2025-09-23 05:45:08.525 | 1 | INFO | STARTUP | <main> | StaticPlatformBuilder: | ||
| ////////////////////// // Node is Starting // ////////////////////// | |||||||||
| node4 | 562.000ms | 2025-09-23 05:45:08.544 | 1 | INFO | STARTUP | <main> | StaticPlatformBuilder: | ||
| ////////////////////// // Node is Starting // ////////////////////// | |||||||||
| node1 | 633.000ms | 2025-09-23 05:45:08.615 | 2 | DEBUG | STARTUP | <main> | StaticPlatformBuilder: | main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload] | |
| node0 | 648.000ms | 2025-09-23 05:45:08.630 | 9 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | ConsistencyTestingToolState is registered with ConstructableRegistry | |
| node1 | 648.000ms | 2025-09-23 05:45:08.630 | 3 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node0 | 649.000ms | 2025-09-23 05:45:08.631 | 10 | DEBUG | STARTUP | <main> | BootstrapUtils: | Scanning the classpath for RuntimeConstructable classes | |
| node4 | 651.000ms | 2025-09-23 05:45:08.633 | 2 | DEBUG | STARTUP | <main> | StaticPlatformBuilder: | main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload] | |
| node4 | 667.000ms | 2025-09-23 05:45:08.649 | 3 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node1 | 759.000ms | 2025-09-23 05:45:08.741 | 4 | INFO | STARTUP | <main> | Browser: | The following nodes [1] are set to run locally | |
| node1 | 766.000ms | 2025-09-23 05:45:08.748 | 5 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | Registering ConsistencyTestingToolState with ConstructableRegistry | |
| node1 | 777.000ms | 2025-09-23 05:45:08.759 | 6 | INFO | STARTUP | <main> | ConsistencyTestingToolState: | New State Constructed. | |
| node4 | 782.000ms | 2025-09-23 05:45:08.764 | 4 | INFO | STARTUP | <main> | Browser: | The following nodes [4] are set to run locally | |
| node4 | 788.000ms | 2025-09-23 05:45:08.770 | 5 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | Registering ConsistencyTestingToolState with ConstructableRegistry | |
| node4 | 800.000ms | 2025-09-23 05:45:08.782 | 6 | INFO | STARTUP | <main> | ConsistencyTestingToolState: | New State Constructed. | |
| node3 | 979.000ms | 2025-09-23 05:45:08.961 | 1 | INFO | STARTUP | <main> | StaticPlatformBuilder: | ||
| ////////////////////// // Node is Starting // ////////////////////// | |||||||||
| node2 | 1.007s | 2025-09-23 05:45:08.989 | 1 | INFO | STARTUP | <main> | StaticPlatformBuilder: | ||
| ////////////////////// // Node is Starting // ////////////////////// | |||||||||
| node3 | 1.071s | 2025-09-23 05:45:09.053 | 2 | DEBUG | STARTUP | <main> | StaticPlatformBuilder: | main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload] | |
| node3 | 1.087s | 2025-09-23 05:45:09.069 | 3 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node2 | 1.107s | 2025-09-23 05:45:09.089 | 2 | DEBUG | STARTUP | <main> | StaticPlatformBuilder: | main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload] | |
| node2 | 1.123s | 2025-09-23 05:45:09.105 | 3 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node1 | 1.199s | 2025-09-23 05:45:09.181 | 9 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | ConsistencyTestingToolState is registered with ConstructableRegistry | |
| node1 | 1.200s | 2025-09-23 05:45:09.182 | 10 | DEBUG | STARTUP | <main> | BootstrapUtils: | Scanning the classpath for RuntimeConstructable classes | |
| node3 | 1.203s | 2025-09-23 05:45:09.185 | 4 | INFO | STARTUP | <main> | Browser: | The following nodes [3] are set to run locally | |
| node3 | 1.210s | 2025-09-23 05:45:09.192 | 5 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | Registering ConsistencyTestingToolState with ConstructableRegistry | |
| node4 | 1.215s | 2025-09-23 05:45:09.197 | 9 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | ConsistencyTestingToolState is registered with ConstructableRegistry | |
| node4 | 1.216s | 2025-09-23 05:45:09.198 | 10 | DEBUG | STARTUP | <main> | BootstrapUtils: | Scanning the classpath for RuntimeConstructable classes | |
| node3 | 1.222s | 2025-09-23 05:45:09.204 | 6 | INFO | STARTUP | <main> | ConsistencyTestingToolState: | New State Constructed. | |
| node2 | 1.247s | 2025-09-23 05:45:09.229 | 4 | INFO | STARTUP | <main> | Browser: | The following nodes [2] are set to run locally | |
| node2 | 1.254s | 2025-09-23 05:45:09.236 | 5 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | Registering ConsistencyTestingToolState with ConstructableRegistry | |
| node2 | 1.268s | 2025-09-23 05:45:09.250 | 6 | INFO | STARTUP | <main> | ConsistencyTestingToolState: | New State Constructed. | |
| node0 | 1.474s | 2025-09-23 05:45:09.456 | 11 | DEBUG | STARTUP | <main> | BootstrapUtils: | Done with registerConstructables, time taken 824ms | |
| node0 | 1.487s | 2025-09-23 05:45:09.469 | 12 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | constructor called in Main. | |
| node0 | 1.490s | 2025-09-23 05:45:09.472 | 13 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node0 | 1.536s | 2025-09-23 05:45:09.518 | 14 | INFO | STARTUP | <main> | PrometheusEndpoint: | PrometheusEndpoint: Starting server listing on port: 9999 | |
| node0 | 1.596s | 2025-09-23 05:45:09.578 | 15 | WARN | STARTUP | <main> | CryptoStatic: | There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB. | |
| node0 | 1.597s | 2025-09-23 05:45:09.579 | 16 | DEBUG | STARTUP | <main> | CryptoStatic: | Started generating keys | |
| node3 | 1.657s | 2025-09-23 05:45:09.639 | 9 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | ConsistencyTestingToolState is registered with ConstructableRegistry | |
| node3 | 1.658s | 2025-09-23 05:45:09.640 | 10 | DEBUG | STARTUP | <main> | BootstrapUtils: | Scanning the classpath for RuntimeConstructable classes | |
| node2 | 1.752s | 2025-09-23 05:45:09.734 | 9 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | ConsistencyTestingToolState is registered with ConstructableRegistry | |
| node2 | 1.753s | 2025-09-23 05:45:09.735 | 10 | DEBUG | STARTUP | <main> | BootstrapUtils: | Scanning the classpath for RuntimeConstructable classes | |
| node1 | 2.113s | 2025-09-23 05:45:10.095 | 11 | DEBUG | STARTUP | <main> | BootstrapUtils: | Done with registerConstructables, time taken 913ms | |
| node1 | 2.121s | 2025-09-23 05:45:10.103 | 12 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | constructor called in Main. | |
| node1 | 2.128s | 2025-09-23 05:45:10.110 | 13 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node4 | 2.141s | 2025-09-23 05:45:10.123 | 11 | DEBUG | STARTUP | <main> | BootstrapUtils: | Done with registerConstructables, time taken 925ms | |
| node4 | 2.149s | 2025-09-23 05:45:10.131 | 12 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | constructor called in Main. | |
| node4 | 2.153s | 2025-09-23 05:45:10.135 | 13 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node1 | 2.172s | 2025-09-23 05:45:10.154 | 14 | INFO | STARTUP | <main> | PrometheusEndpoint: | PrometheusEndpoint: Starting server listing on port: 9999 | |
| node4 | 2.191s | 2025-09-23 05:45:10.173 | 14 | INFO | STARTUP | <main> | PrometheusEndpoint: | PrometheusEndpoint: Starting server listing on port: 9999 | |
| node1 | 2.231s | 2025-09-23 05:45:10.213 | 15 | WARN | STARTUP | <main> | CryptoStatic: | There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB. | |
| node1 | 2.232s | 2025-09-23 05:45:10.214 | 16 | DEBUG | STARTUP | <main> | CryptoStatic: | Started generating keys | |
| node4 | 2.250s | 2025-09-23 05:45:10.232 | 15 | WARN | STARTUP | <main> | CryptoStatic: | There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB. | |
| node4 | 2.251s | 2025-09-23 05:45:10.233 | 16 | DEBUG | STARTUP | <main> | CryptoStatic: | Started generating keys | |
| node3 | 2.816s | 2025-09-23 05:45:10.798 | 11 | DEBUG | STARTUP | <main> | BootstrapUtils: | Done with registerConstructables, time taken 1157ms | |
| node2 | 2.823s | 2025-09-23 05:45:10.805 | 11 | DEBUG | STARTUP | <main> | BootstrapUtils: | Done with registerConstructables, time taken 1069ms | |
| node3 | 2.824s | 2025-09-23 05:45:10.806 | 12 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | constructor called in Main. | |
| node3 | 2.827s | 2025-09-23 05:45:10.809 | 13 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node2 | 2.832s | 2025-09-23 05:45:10.814 | 12 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | constructor called in Main. | |
| node2 | 2.836s | 2025-09-23 05:45:10.818 | 13 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node3 | 2.866s | 2025-09-23 05:45:10.848 | 14 | INFO | STARTUP | <main> | PrometheusEndpoint: | PrometheusEndpoint: Starting server listing on port: 9999 | |
| node2 | 2.878s | 2025-09-23 05:45:10.860 | 14 | INFO | STARTUP | <main> | PrometheusEndpoint: | PrometheusEndpoint: Starting server listing on port: 9999 | |
| node3 | 2.928s | 2025-09-23 05:45:10.910 | 15 | WARN | STARTUP | <main> | CryptoStatic: | There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB. | |
| node3 | 2.929s | 2025-09-23 05:45:10.911 | 16 | DEBUG | STARTUP | <main> | CryptoStatic: | Started generating keys | |
| node2 | 2.963s | 2025-09-23 05:45:10.945 | 15 | WARN | STARTUP | <main> | CryptoStatic: | There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB. | |
| node2 | 2.965s | 2025-09-23 05:45:10.947 | 16 | DEBUG | STARTUP | <main> | CryptoStatic: | Started generating keys | |
| node0 | 3.639s | 2025-09-23 05:45:11.621 | 17 | DEBUG | STARTUP | <main> | CryptoStatic: | Done generating keys | |
| node0 | 3.726s | 2025-09-23 05:45:11.708 | 20 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node0 | 3.728s | 2025-09-23 05:45:11.710 | 21 | INFO | STARTUP | <main> | StartupStateUtils: | No saved states were found on disk. | |
| node0 | 3.729s | 2025-09-23 05:45:11.711 | 22 | INFO | STARTUP | <main> | ConsistencyTestingToolState: | New State Constructed. | |
| node4 | 4.253s | 2025-09-23 05:45:12.235 | 17 | DEBUG | STARTUP | <main> | CryptoStatic: | Done generating keys | |
| node1 | 4.263s | 2025-09-23 05:45:12.245 | 17 | DEBUG | STARTUP | <main> | CryptoStatic: | Done generating keys | |
| node1 | 4.347s | 2025-09-23 05:45:12.329 | 20 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node4 | 4.347s | 2025-09-23 05:45:12.329 | 20 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node1 | 4.349s | 2025-09-23 05:45:12.331 | 21 | INFO | STARTUP | <main> | StartupStateUtils: | No saved states were found on disk. | |
| node1 | 4.349s | 2025-09-23 05:45:12.331 | 22 | INFO | STARTUP | <main> | ConsistencyTestingToolState: | New State Constructed. | |
| node4 | 4.349s | 2025-09-23 05:45:12.331 | 21 | INFO | STARTUP | <main> | StartupStateUtils: | No saved states were found on disk. | |
| node4 | 4.350s | 2025-09-23 05:45:12.332 | 22 | INFO | STARTUP | <main> | ConsistencyTestingToolState: | New State Constructed. | |
| node0 | 4.489s | 2025-09-23 05:45:12.471 | 29 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node0 | 4.492s | 2025-09-23 05:45:12.474 | 32 | INFO | STARTUP | <main> | BootstrapUtils: | Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]. | |
| node0 | 4.497s | 2025-09-23 05:45:12.479 | 33 | INFO | STARTUP | <main> | AddressBookInitializer: | Starting from genesis: using the config address book. | |
| node0 | 4.507s | 2025-09-23 05:45:12.489 | 34 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node0 | 4.508s | 2025-09-23 05:45:12.490 | 35 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node3 | 5.032s | 2025-09-23 05:45:13.014 | 17 | DEBUG | STARTUP | <main> | CryptoStatic: | Done generating keys | |
| node2 | 5.041s | 2025-09-23 05:45:13.023 | 17 | DEBUG | STARTUP | <main> | CryptoStatic: | Done generating keys | |
| node3 | 5.122s | 2025-09-23 05:45:13.104 | 20 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node1 | 5.123s | 2025-09-23 05:45:13.105 | 29 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node4 | 5.123s | 2025-09-23 05:45:13.105 | 29 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node3 | 5.124s | 2025-09-23 05:45:13.106 | 21 | INFO | STARTUP | <main> | StartupStateUtils: | No saved states were found on disk. | |
| node3 | 5.125s | 2025-09-23 05:45:13.107 | 22 | INFO | STARTUP | <main> | ConsistencyTestingToolState: | New State Constructed. | |
| node1 | 5.126s | 2025-09-23 05:45:13.108 | 32 | INFO | STARTUP | <main> | BootstrapUtils: | Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]. | |
| node4 | 5.126s | 2025-09-23 05:45:13.108 | 32 | INFO | STARTUP | <main> | BootstrapUtils: | Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]. | |
| node2 | 5.128s | 2025-09-23 05:45:13.110 | 20 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node2 | 5.131s | 2025-09-23 05:45:13.113 | 21 | INFO | STARTUP | <main> | StartupStateUtils: | No saved states were found on disk. | |
| node2 | 5.132s | 2025-09-23 05:45:13.114 | 22 | INFO | STARTUP | <main> | ConsistencyTestingToolState: | New State Constructed. | |
| node4 | 5.132s | 2025-09-23 05:45:13.114 | 33 | INFO | STARTUP | <main> | AddressBookInitializer: | Starting from genesis: using the config address book. | |
| node1 | 5.133s | 2025-09-23 05:45:13.115 | 33 | INFO | STARTUP | <main> | AddressBookInitializer: | Starting from genesis: using the config address book. | |
| node4 | 5.143s | 2025-09-23 05:45:13.125 | 34 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node1 | 5.145s | 2025-09-23 05:45:13.127 | 34 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node4 | 5.146s | 2025-09-23 05:45:13.128 | 35 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node1 | 5.147s | 2025-09-23 05:45:13.129 | 35 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node0 | 5.611s | 2025-09-23 05:45:13.593 | 36 | INFO | STARTUP | <main> | OSHealthChecker: | ||
| PASSED - Clock Source Speed Check Report[callsPerSec=26343211] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=239869, randomLong=5269938777815878957, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=10000, randomLong=9205283286322441295, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=1117240, data=35, exception=null] OS Health Check Report - Complete (took 1020 ms) | |||||||||
| node0 | 5.639s | 2025-09-23 05:45:13.621 | 37 | DEBUG | STARTUP | <main> | BootstrapUtils: | jvmPauseDetectorThread started | |
| node0 | 5.646s | 2025-09-23 05:45:13.628 | 38 | INFO | STARTUP | <main> | StandardScratchpad: | Scratchpad platform.iss contents: | |
| LAST_ISS_ROUND null | |||||||||
| node0 | 5.651s | 2025-09-23 05:45:13.633 | 39 | INFO | STARTUP | <main> | PlatformBuilder: | Default platform pool parallelism: 8 | |
| node0 | 5.727s | 2025-09-23 05:45:13.709 | 40 | INFO | STARTUP | <main> | SwirldsPlatform: | Starting with roster history: | |
| RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "I+Fdwg==", "port": 30124 }, { "ipAddressV4": "CoAAdw==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "Ijtt6w==", "port": 30125 }, { "ipAddressV4": "CoAAdg==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "IjoaMw==", "port": 30126 }, { "ipAddressV4": "CoAAeA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "IobWVg==", "port": 30127 }, { "ipAddressV4": "CoAAdQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "I8pw6w==", "port": 30128 }, { "ipAddressV4": "CoAAdA==", "port": 30128 }] }] } | |||||||||
| node0 | 5.748s | 2025-09-23 05:45:13.730 | 41 | INFO | STARTUP | <main> | TransactionHandlingHistory: | Consistency testing tool log path: data/saved/consistency-test/0/ConsistencyTestLog.csv | |
| node0 | 5.749s | 2025-09-23 05:45:13.731 | 42 | INFO | STARTUP | <main> | TransactionHandlingHistory: | No log file found. Starting without any previous history | |
| node0 | 5.762s | 2025-09-23 05:45:13.744 | 43 | INFO | STARTUP | <main> | StateInitializer: | The platform is using the following initial state: | |
| Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: 794ba8da60393f48add668639d012e1d61e5b667b1cf40565693315aeedf7a702df9d3390d70811cb0336a38d308466d (root) ConsistencyTestingToolState / city-tooth-already-trouble 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate 1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street 2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem | |||||||||
| node3 | 5.903s | 2025-09-23 05:45:13.885 | 29 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node3 | 5.907s | 2025-09-23 05:45:13.889 | 32 | INFO | STARTUP | <main> | BootstrapUtils: | Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]. | |
| node3 | 5.913s | 2025-09-23 05:45:13.895 | 33 | INFO | STARTUP | <main> | AddressBookInitializer: | Starting from genesis: using the config address book. | |
| node3 | 5.924s | 2025-09-23 05:45:13.906 | 34 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node3 | 5.925s | 2025-09-23 05:45:13.907 | 35 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node0 | 5.989s | 2025-09-23 05:45:13.971 | 45 | INFO | EVENT_STREAM | <main> | DefaultConsensusEventStream: | EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b | |
| node0 | 5.994s | 2025-09-23 05:45:13.976 | 46 | INFO | STARTUP | <platformForkJoinThread-2> | Shadowgraph: | Shadowgraph starting from expiration threshold 1 | |
| node0 | 5.999s | 2025-09-23 05:45:13.981 | 47 | INFO | STARTUP | <<start-node-0>> | ConsistencyTestingToolMain: | init called in Main for node 0. | |
| node0 | 5.999s | 2025-09-23 05:45:13.981 | 48 | INFO | STARTUP | <<start-node-0>> | SwirldsPlatform: | Starting platform 0 | |
| node0 | 6.000s | 2025-09-23 05:45:13.982 | 49 | INFO | STARTUP | <<platform: recycle-bin-cleanup>> | RecycleBinImpl: | Deleted 0 files from the recycle bin. | |
| node0 | 6.004s | 2025-09-23 05:45:13.986 | 50 | INFO | STARTUP | <<start-node-0>> | CycleFinder: | No cyclical back pressure detected in wiring model. | |
| node0 | 6.005s | 2025-09-23 05:45:13.987 | 51 | INFO | STARTUP | <<start-node-0>> | DirectSchedulerChecks: | No illegal direct scheduler use detected in the wiring model. | |
| node0 | 6.005s | 2025-09-23 05:45:13.987 | 52 | INFO | STARTUP | <<start-node-0>> | InputWireChecks: | All input wires have been bound. | |
| node0 | 6.007s | 2025-09-23 05:45:13.989 | 53 | WARN | STARTUP | <<start-node-0>> | PcesFileTracker: | No preconsensus event files available | |
| node0 | 6.007s | 2025-09-23 05:45:13.989 | 54 | INFO | STARTUP | <<start-node-0>> | SwirldsPlatform: | replaying preconsensus event stream starting at 0 | |
| node0 | 6.009s | 2025-09-23 05:45:13.991 | 55 | INFO | STARTUP | <<start-node-0>> | PcesReplayer: | Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds. | |
| node0 | 6.010s | 2025-09-23 05:45:13.992 | 56 | INFO | STARTUP | <<app: appMain 0>> | ConsistencyTestingToolMain: | run called in Main. | |
| node0 | 6.012s | 2025-09-23 05:45:13.994 | 57 | INFO | PLATFORM_STATUS | <platformForkJoinThread-5> | DefaultStatusStateMachine: | Platform spent 193.0 ms in STARTING_UP. Now in REPLAYING_EVENTS | |
| node0 | 6.017s | 2025-09-23 05:45:13.999 | 58 | INFO | PLATFORM_STATUS | <platformForkJoinThread-5> | DefaultStatusStateMachine: | Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING | |
| node2 | 6.028s | 2025-09-23 05:45:14.010 | 29 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node2 | 6.032s | 2025-09-23 05:45:14.014 | 32 | INFO | STARTUP | <main> | BootstrapUtils: | Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]. | |
| node2 | 6.039s | 2025-09-23 05:45:14.021 | 33 | INFO | STARTUP | <main> | AddressBookInitializer: | Starting from genesis: using the config address book. | |
| node2 | 6.051s | 2025-09-23 05:45:14.033 | 34 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node2 | 6.053s | 2025-09-23 05:45:14.035 | 35 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node1 | 6.255s | 2025-09-23 05:45:14.237 | 36 | INFO | STARTUP | <main> | OSHealthChecker: | ||
| PASSED - Clock Source Speed Check Report[callsPerSec=26364792] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=274778, randomLong=3334273151564762875, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=26040, randomLong=5514747870207753997, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=1359282, data=35, exception=null] OS Health Check Report - Complete (took 1022 ms) | |||||||||
| node4 | 6.269s | 2025-09-23 05:45:14.251 | 36 | INFO | STARTUP | <main> | OSHealthChecker: | ||
| PASSED - Clock Source Speed Check Report[callsPerSec=26375998] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=262130, randomLong=-2734205488420192239, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=9600, randomLong=-676621092589143748, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=1193290, data=35, exception=null] OS Health Check Report - Complete (took 1021 ms) | |||||||||
| node1 | 6.284s | 2025-09-23 05:45:14.266 | 37 | DEBUG | STARTUP | <main> | BootstrapUtils: | jvmPauseDetectorThread started | |
| node1 | 6.292s | 2025-09-23 05:45:14.274 | 38 | INFO | STARTUP | <main> | StandardScratchpad: | Scratchpad platform.iss contents: | |
| LAST_ISS_ROUND null | |||||||||
| node1 | 6.297s | 2025-09-23 05:45:14.279 | 39 | INFO | STARTUP | <main> | PlatformBuilder: | Default platform pool parallelism: 8 | |
| node4 | 6.299s | 2025-09-23 05:45:14.281 | 37 | DEBUG | STARTUP | <main> | BootstrapUtils: | jvmPauseDetectorThread started | |
| node4 | 6.307s | 2025-09-23 05:45:14.289 | 38 | INFO | STARTUP | <main> | StandardScratchpad: | Scratchpad platform.iss contents: | |
| LAST_ISS_ROUND null | |||||||||
| node4 | 6.313s | 2025-09-23 05:45:14.295 | 39 | INFO | STARTUP | <main> | PlatformBuilder: | Default platform pool parallelism: 8 | |
| node1 | 6.373s | 2025-09-23 05:45:14.355 | 40 | INFO | STARTUP | <main> | SwirldsPlatform: | Starting with roster history: | |
| RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "I+Fdwg==", "port": 30124 }, { "ipAddressV4": "CoAAdw==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "Ijtt6w==", "port": 30125 }, { "ipAddressV4": "CoAAdg==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "IjoaMw==", "port": 30126 }, { "ipAddressV4": "CoAAeA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "IobWVg==", "port": 30127 }, { "ipAddressV4": "CoAAdQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "I8pw6w==", "port": 30128 }, { "ipAddressV4": "CoAAdA==", "port": 30128 }] }] } | |||||||||
| node4 | 6.392s | 2025-09-23 05:45:14.374 | 40 | INFO | STARTUP | <main> | SwirldsPlatform: | Starting with roster history: | |
| RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "I+Fdwg==", "port": 30124 }, { "ipAddressV4": "CoAAdw==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "Ijtt6w==", "port": 30125 }, { "ipAddressV4": "CoAAdg==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "IjoaMw==", "port": 30126 }, { "ipAddressV4": "CoAAeA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "IobWVg==", "port": 30127 }, { "ipAddressV4": "CoAAdQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "I8pw6w==", "port": 30128 }, { "ipAddressV4": "CoAAdA==", "port": 30128 }] }] } | |||||||||
| node1 | 6.394s | 2025-09-23 05:45:14.376 | 41 | INFO | STARTUP | <main> | TransactionHandlingHistory: | Consistency testing tool log path: data/saved/consistency-test/1/ConsistencyTestLog.csv | |
| node1 | 6.394s | 2025-09-23 05:45:14.376 | 42 | INFO | STARTUP | <main> | TransactionHandlingHistory: | No log file found. Starting without any previous history | |
| node1 | 6.409s | 2025-09-23 05:45:14.391 | 43 | INFO | STARTUP | <main> | StateInitializer: | The platform is using the following initial state: | |
| Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: 794ba8da60393f48add668639d012e1d61e5b667b1cf40565693315aeedf7a702df9d3390d70811cb0336a38d308466d (root) ConsistencyTestingToolState / city-tooth-already-trouble 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate 1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street 2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem | |||||||||
| node4 | 6.412s | 2025-09-23 05:45:14.394 | 41 | INFO | STARTUP | <main> | TransactionHandlingHistory: | Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv | |
| node4 | 6.412s | 2025-09-23 05:45:14.394 | 42 | INFO | STARTUP | <main> | TransactionHandlingHistory: | No log file found. Starting without any previous history | |
| node4 | 6.426s | 2025-09-23 05:45:14.408 | 43 | INFO | STARTUP | <main> | StateInitializer: | The platform is using the following initial state: | |
| Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: 794ba8da60393f48add668639d012e1d61e5b667b1cf40565693315aeedf7a702df9d3390d70811cb0336a38d308466d (root) ConsistencyTestingToolState / city-tooth-already-trouble 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate 1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street 2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem | |||||||||
| node4 | 6.599s | 2025-09-23 05:45:14.581 | 45 | INFO | EVENT_STREAM | <main> | DefaultConsensusEventStream: | EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b | |
| node4 | 6.603s | 2025-09-23 05:45:14.585 | 46 | INFO | STARTUP | <platformForkJoinThread-2> | Shadowgraph: | Shadowgraph starting from expiration threshold 1 | |
| node4 | 6.607s | 2025-09-23 05:45:14.589 | 47 | INFO | STARTUP | <<start-node-4>> | ConsistencyTestingToolMain: | init called in Main for node 4. | |
| node4 | 6.608s | 2025-09-23 05:45:14.590 | 48 | INFO | STARTUP | <<start-node-4>> | SwirldsPlatform: | Starting platform 4 | |
| node4 | 6.609s | 2025-09-23 05:45:14.591 | 49 | INFO | STARTUP | <<platform: recycle-bin-cleanup>> | RecycleBinImpl: | Deleted 0 files from the recycle bin. | |
| node4 | 6.613s | 2025-09-23 05:45:14.595 | 50 | INFO | STARTUP | <<start-node-4>> | CycleFinder: | No cyclical back pressure detected in wiring model. | |
| node4 | 6.614s | 2025-09-23 05:45:14.596 | 51 | INFO | STARTUP | <<start-node-4>> | DirectSchedulerChecks: | No illegal direct scheduler use detected in the wiring model. | |
| node4 | 6.615s | 2025-09-23 05:45:14.597 | 52 | INFO | STARTUP | <<start-node-4>> | InputWireChecks: | All input wires have been bound. | |
| node4 | 6.617s | 2025-09-23 05:45:14.599 | 53 | WARN | STARTUP | <<start-node-4>> | PcesFileTracker: | No preconsensus event files available | |
| node4 | 6.617s | 2025-09-23 05:45:14.599 | 54 | INFO | STARTUP | <<start-node-4>> | SwirldsPlatform: | replaying preconsensus event stream starting at 0 | |
| node4 | 6.619s | 2025-09-23 05:45:14.601 | 55 | INFO | STARTUP | <<start-node-4>> | PcesReplayer: | Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds. | |
| node4 | 6.620s | 2025-09-23 05:45:14.602 | 56 | INFO | STARTUP | <<app: appMain 4>> | ConsistencyTestingToolMain: | run called in Main. | |
| node4 | 6.621s | 2025-09-23 05:45:14.603 | 57 | INFO | PLATFORM_STATUS | <platformForkJoinThread-5> | DefaultStatusStateMachine: | Platform spent 143.0 ms in STARTING_UP. Now in REPLAYING_EVENTS | |
| node4 | 6.626s | 2025-09-23 05:45:14.608 | 58 | INFO | PLATFORM_STATUS | <platformForkJoinThread-5> | DefaultStatusStateMachine: | Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING | |
| node1 | 6.636s | 2025-09-23 05:45:14.618 | 45 | INFO | EVENT_STREAM | <main> | DefaultConsensusEventStream: | EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b | |
| node1 | 6.641s | 2025-09-23 05:45:14.623 | 46 | INFO | STARTUP | <platformForkJoinThread-2> | Shadowgraph: | Shadowgraph starting from expiration threshold 1 | |
| node1 | 6.646s | 2025-09-23 05:45:14.628 | 47 | INFO | STARTUP | <<start-node-1>> | ConsistencyTestingToolMain: | init called in Main for node 1. | |
| node1 | 6.646s | 2025-09-23 05:45:14.628 | 48 | INFO | STARTUP | <<start-node-1>> | SwirldsPlatform: | Starting platform 1 | |
| node1 | 6.647s | 2025-09-23 05:45:14.629 | 49 | INFO | STARTUP | <<platform: recycle-bin-cleanup>> | RecycleBinImpl: | Deleted 0 files from the recycle bin. | |
| node1 | 6.651s | 2025-09-23 05:45:14.633 | 50 | INFO | STARTUP | <<start-node-1>> | CycleFinder: | No cyclical back pressure detected in wiring model. | |
| node1 | 6.652s | 2025-09-23 05:45:14.634 | 51 | INFO | STARTUP | <<start-node-1>> | DirectSchedulerChecks: | No illegal direct scheduler use detected in the wiring model. | |
| node1 | 6.652s | 2025-09-23 05:45:14.634 | 52 | INFO | STARTUP | <<start-node-1>> | InputWireChecks: | All input wires have been bound. | |
| node1 | 6.654s | 2025-09-23 05:45:14.636 | 53 | WARN | STARTUP | <<start-node-1>> | PcesFileTracker: | No preconsensus event files available | |
| node1 | 6.654s | 2025-09-23 05:45:14.636 | 54 | INFO | STARTUP | <<start-node-1>> | SwirldsPlatform: | replaying preconsensus event stream starting at 0 | |
| node1 | 6.655s | 2025-09-23 05:45:14.637 | 55 | INFO | STARTUP | <<start-node-1>> | PcesReplayer: | Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds. | |
| node1 | 6.656s | 2025-09-23 05:45:14.638 | 56 | INFO | STARTUP | <<app: appMain 1>> | ConsistencyTestingToolMain: | run called in Main. | |
| node1 | 6.657s | 2025-09-23 05:45:14.639 | 57 | INFO | PLATFORM_STATUS | <platformForkJoinThread-3> | DefaultStatusStateMachine: | Platform spent 191.0 ms in STARTING_UP. Now in REPLAYING_EVENTS | |
| node1 | 6.662s | 2025-09-23 05:45:14.644 | 58 | INFO | PLATFORM_STATUS | <platformForkJoinThread-3> | DefaultStatusStateMachine: | Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING | |
| node3 | 7.036s | 2025-09-23 05:45:15.018 | 36 | INFO | STARTUP | <main> | OSHealthChecker: | ||
| PASSED - Clock Source Speed Check Report[callsPerSec=26230775] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=158140, randomLong=474228438656086729, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=10151, randomLong=-3777970255766792612, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=1162179, data=35, exception=null] OS Health Check Report - Complete (took 1024 ms) | |||||||||
| node3 | 7.069s | 2025-09-23 05:45:15.051 | 37 | DEBUG | STARTUP | <main> | BootstrapUtils: | jvmPauseDetectorThread started | |
| node3 | 7.077s | 2025-09-23 05:45:15.059 | 38 | INFO | STARTUP | <main> | StandardScratchpad: | Scratchpad platform.iss contents: | |
| LAST_ISS_ROUND null | |||||||||
| node3 | 7.083s | 2025-09-23 05:45:15.065 | 39 | INFO | STARTUP | <main> | PlatformBuilder: | Default platform pool parallelism: 8 | |
| node3 | 7.168s | 2025-09-23 05:45:15.150 | 40 | INFO | STARTUP | <main> | SwirldsPlatform: | Starting with roster history: | |
| RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "I+Fdwg==", "port": 30124 }, { "ipAddressV4": "CoAAdw==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "Ijtt6w==", "port": 30125 }, { "ipAddressV4": "CoAAdg==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "IjoaMw==", "port": 30126 }, { "ipAddressV4": "CoAAeA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "IobWVg==", "port": 30127 }, { "ipAddressV4": "CoAAdQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "I8pw6w==", "port": 30128 }, { "ipAddressV4": "CoAAdA==", "port": 30128 }] }] } | |||||||||
| node2 | 7.172s | 2025-09-23 05:45:15.154 | 36 | INFO | STARTUP | <main> | OSHealthChecker: | ||
| PASSED - Clock Source Speed Check Report[callsPerSec=26186026] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=302700, randomLong=7758237998098815608, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=87200, randomLong=2058097255058432240, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=2174540, data=35, exception=null] OS Health Check Report - Complete (took 1029 ms) | |||||||||
| node3 | 7.191s | 2025-09-23 05:45:15.173 | 41 | INFO | STARTUP | <main> | TransactionHandlingHistory: | Consistency testing tool log path: data/saved/consistency-test/3/ConsistencyTestLog.csv | |
| node3 | 7.192s | 2025-09-23 05:45:15.174 | 42 | INFO | STARTUP | <main> | TransactionHandlingHistory: | No log file found. Starting without any previous history | |
| node3 | 7.208s | 2025-09-23 05:45:15.190 | 43 | INFO | STARTUP | <main> | StateInitializer: | The platform is using the following initial state: | |
| Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: 794ba8da60393f48add668639d012e1d61e5b667b1cf40565693315aeedf7a702df9d3390d70811cb0336a38d308466d (root) ConsistencyTestingToolState / city-tooth-already-trouble 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate 1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street 2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem | |||||||||
| node2 | 7.210s | 2025-09-23 05:45:15.192 | 37 | DEBUG | STARTUP | <main> | BootstrapUtils: | jvmPauseDetectorThread started | |
| node2 | 7.220s | 2025-09-23 05:45:15.202 | 38 | INFO | STARTUP | <main> | StandardScratchpad: | Scratchpad platform.iss contents: | |
| LAST_ISS_ROUND null | |||||||||
| node2 | 7.226s | 2025-09-23 05:45:15.208 | 39 | INFO | STARTUP | <main> | PlatformBuilder: | Default platform pool parallelism: 8 | |
| node2 | 7.318s | 2025-09-23 05:45:15.300 | 40 | INFO | STARTUP | <main> | SwirldsPlatform: | Starting with roster history: | |
| RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "I+Fdwg==", "port": 30124 }, { "ipAddressV4": "CoAAdw==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "Ijtt6w==", "port": 30125 }, { "ipAddressV4": "CoAAdg==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "IjoaMw==", "port": 30126 }, { "ipAddressV4": "CoAAeA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "IobWVg==", "port": 30127 }, { "ipAddressV4": "CoAAdQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "I8pw6w==", "port": 30128 }, { "ipAddressV4": "CoAAdA==", "port": 30128 }] }] } | |||||||||
| node2 | 7.344s | 2025-09-23 05:45:15.326 | 41 | INFO | STARTUP | <main> | TransactionHandlingHistory: | Consistency testing tool log path: data/saved/consistency-test/2/ConsistencyTestLog.csv | |
| node2 | 7.345s | 2025-09-23 05:45:15.327 | 42 | INFO | STARTUP | <main> | TransactionHandlingHistory: | No log file found. Starting without any previous history | |
| node2 | 7.364s | 2025-09-23 05:45:15.346 | 43 | INFO | STARTUP | <main> | StateInitializer: | The platform is using the following initial state: | |
| Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: 794ba8da60393f48add668639d012e1d61e5b667b1cf40565693315aeedf7a702df9d3390d70811cb0336a38d308466d (root) ConsistencyTestingToolState / city-tooth-already-trouble 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate 1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street 2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem | |||||||||
| node3 | 7.415s | 2025-09-23 05:45:15.397 | 45 | INFO | EVENT_STREAM | <main> | DefaultConsensusEventStream: | EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b | |
| node3 | 7.420s | 2025-09-23 05:45:15.402 | 46 | INFO | STARTUP | <platformForkJoinThread-2> | Shadowgraph: | Shadowgraph starting from expiration threshold 1 | |
| node3 | 7.425s | 2025-09-23 05:45:15.407 | 47 | INFO | STARTUP | <<start-node-3>> | ConsistencyTestingToolMain: | init called in Main for node 3. | |
| node3 | 7.426s | 2025-09-23 05:45:15.408 | 48 | INFO | STARTUP | <<start-node-3>> | SwirldsPlatform: | Starting platform 3 | |
| node3 | 7.427s | 2025-09-23 05:45:15.409 | 49 | INFO | STARTUP | <<platform: recycle-bin-cleanup>> | RecycleBinImpl: | Deleted 0 files from the recycle bin. | |
| node3 | 7.431s | 2025-09-23 05:45:15.413 | 50 | INFO | STARTUP | <<start-node-3>> | CycleFinder: | No cyclical back pressure detected in wiring model. | |
| node3 | 7.432s | 2025-09-23 05:45:15.414 | 51 | INFO | STARTUP | <<start-node-3>> | DirectSchedulerChecks: | No illegal direct scheduler use detected in the wiring model. | |
| node3 | 7.433s | 2025-09-23 05:45:15.415 | 52 | INFO | STARTUP | <<start-node-3>> | InputWireChecks: | All input wires have been bound. | |
| node3 | 7.434s | 2025-09-23 05:45:15.416 | 53 | WARN | STARTUP | <<start-node-3>> | PcesFileTracker: | No preconsensus event files available | |
| node3 | 7.435s | 2025-09-23 05:45:15.417 | 54 | INFO | STARTUP | <<start-node-3>> | SwirldsPlatform: | replaying preconsensus event stream starting at 0 | |
| node3 | 7.436s | 2025-09-23 05:45:15.418 | 55 | INFO | STARTUP | <<start-node-3>> | PcesReplayer: | Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds. | |
| node3 | 7.438s | 2025-09-23 05:45:15.420 | 56 | INFO | STARTUP | <<app: appMain 3>> | ConsistencyTestingToolMain: | run called in Main. | |
| node3 | 7.439s | 2025-09-23 05:45:15.421 | 57 | INFO | PLATFORM_STATUS | <platformForkJoinThread-3> | DefaultStatusStateMachine: | Platform spent 174.0 ms in STARTING_UP. Now in REPLAYING_EVENTS | |
| node3 | 7.444s | 2025-09-23 05:45:15.426 | 58 | INFO | PLATFORM_STATUS | <platformForkJoinThread-3> | DefaultStatusStateMachine: | Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING | |
| node2 | 7.572s | 2025-09-23 05:45:15.554 | 45 | INFO | EVENT_STREAM | <main> | DefaultConsensusEventStream: | EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b | |
| node2 | 7.578s | 2025-09-23 05:45:15.560 | 46 | INFO | STARTUP | <platformForkJoinThread-2> | Shadowgraph: | Shadowgraph starting from expiration threshold 1 | |
| node2 | 7.584s | 2025-09-23 05:45:15.566 | 47 | INFO | STARTUP | <<start-node-2>> | ConsistencyTestingToolMain: | init called in Main for node 2. | |
| node2 | 7.585s | 2025-09-23 05:45:15.567 | 48 | INFO | STARTUP | <<start-node-2>> | SwirldsPlatform: | Starting platform 2 | |
| node2 | 7.586s | 2025-09-23 05:45:15.568 | 49 | INFO | STARTUP | <<platform: recycle-bin-cleanup>> | RecycleBinImpl: | Deleted 0 files from the recycle bin. | |
| node2 | 7.590s | 2025-09-23 05:45:15.572 | 50 | INFO | STARTUP | <<start-node-2>> | CycleFinder: | No cyclical back pressure detected in wiring model. | |
| node2 | 7.591s | 2025-09-23 05:45:15.573 | 51 | INFO | STARTUP | <<start-node-2>> | DirectSchedulerChecks: | No illegal direct scheduler use detected in the wiring model. | |
| node2 | 7.591s | 2025-09-23 05:45:15.573 | 52 | INFO | STARTUP | <<start-node-2>> | InputWireChecks: | All input wires have been bound. | |
| node2 | 7.593s | 2025-09-23 05:45:15.575 | 53 | WARN | STARTUP | <<start-node-2>> | PcesFileTracker: | No preconsensus event files available | |
| node2 | 7.593s | 2025-09-23 05:45:15.575 | 54 | INFO | STARTUP | <<start-node-2>> | SwirldsPlatform: | replaying preconsensus event stream starting at 0 | |
| node2 | 7.595s | 2025-09-23 05:45:15.577 | 55 | INFO | STARTUP | <<start-node-2>> | PcesReplayer: | Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds. | |
| node2 | 7.596s | 2025-09-23 05:45:15.578 | 56 | INFO | STARTUP | <<app: appMain 2>> | ConsistencyTestingToolMain: | run called in Main. | |
| node2 | 7.598s | 2025-09-23 05:45:15.580 | 57 | INFO | PLATFORM_STATUS | <platformForkJoinThread-6> | DefaultStatusStateMachine: | Platform spent 175.0 ms in STARTING_UP. Now in REPLAYING_EVENTS | |
| node2 | 7.602s | 2025-09-23 05:45:15.584 | 58 | INFO | PLATFORM_STATUS | <platformForkJoinThread-6> | DefaultStatusStateMachine: | Platform spent 3.0 ms in REPLAYING_EVENTS. Now in OBSERVING | |
| node0 | 9.009s | 2025-09-23 05:45:16.991 | 59 | INFO | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting0.csv' ] | |
| node0 | 9.010s | 2025-09-23 05:45:16.992 | 60 | DEBUG | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ] | |
| node4 | 9.619s | 2025-09-23 05:45:17.601 | 59 | INFO | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting4.csv' ] | |
| node4 | 9.621s | 2025-09-23 05:45:17.603 | 60 | DEBUG | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ] | |
| node1 | 9.659s | 2025-09-23 05:45:17.641 | 59 | INFO | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting1.csv' ] | |
| node1 | 9.662s | 2025-09-23 05:45:17.644 | 60 | DEBUG | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ] | |
| node3 | 10.436s | 2025-09-23 05:45:18.418 | 59 | INFO | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting3.csv' ] | |
| node3 | 10.438s | 2025-09-23 05:45:18.420 | 60 | DEBUG | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ] | |
| node2 | 10.598s | 2025-09-23 05:45:18.580 | 59 | INFO | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting2.csv' ] | |
| node2 | 10.601s | 2025-09-23 05:45:18.583 | 60 | DEBUG | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ] | |
| node0 | 16.107s | 2025-09-23 05:45:24.089 | 61 | INFO | PLATFORM_STATUS | <platformForkJoinThread-1> | DefaultStatusStateMachine: | Platform spent 10.1 s in OBSERVING. Now in CHECKING | |
| node4 | 16.716s | 2025-09-23 05:45:24.698 | 61 | INFO | PLATFORM_STATUS | <platformForkJoinThread-2> | DefaultStatusStateMachine: | Platform spent 10.1 s in OBSERVING. Now in CHECKING | |
| node1 | 16.753s | 2025-09-23 05:45:24.735 | 61 | INFO | PLATFORM_STATUS | <platformForkJoinThread-6> | DefaultStatusStateMachine: | Platform spent 10.1 s in OBSERVING. Now in CHECKING | |
| node3 | 17.535s | 2025-09-23 05:45:25.517 | 61 | INFO | PLATFORM_STATUS | <platformForkJoinThread-6> | DefaultStatusStateMachine: | Platform spent 10.1 s in OBSERVING. Now in CHECKING | |
| node2 | 17.693s | 2025-09-23 05:45:25.675 | 61 | INFO | PLATFORM_STATUS | <platformForkJoinThread-3> | DefaultStatusStateMachine: | Platform spent 10.1 s in OBSERVING. Now in CHECKING | |
| node2 | 18.380s | 2025-09-23 05:45:26.362 | 63 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS | |
| node0 | 18.451s | 2025-09-23 05:45:26.433 | 62 | INFO | PLATFORM_STATUS | <platformForkJoinThread-5> | DefaultStatusStateMachine: | Platform spent 2.3 s in CHECKING. Now in ACTIVE | |
| node0 | 18.454s | 2025-09-23 05:45:26.436 | 64 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS | |
| node1 | 18.580s | 2025-09-23 05:45:26.562 | 63 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS | |
| node4 | 18.582s | 2025-09-23 05:45:26.564 | 63 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS | |
| node3 | 18.585s | 2025-09-23 05:45:26.567 | 63 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS | |
| node2 | 18.856s | 2025-09-23 05:45:26.838 | 78 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1 | |
| node2 | 18.858s | 2025-09-23 05:45:26.840 | 79 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1 | |
| node3 | 18.927s | 2025-09-23 05:45:26.909 | 78 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1 | |
| node3 | 18.929s | 2025-09-23 05:45:26.911 | 79 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1 | |
| node0 | 19.000s | 2025-09-23 05:45:26.982 | 79 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1 | |
| node0 | 19.002s | 2025-09-23 05:45:26.984 | 80 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1 | |
| node1 | 19.059s | 2025-09-23 05:45:27.041 | 78 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1 | |
| node1 | 19.061s | 2025-09-23 05:45:27.043 | 79 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1 | |
| node2 | 19.124s | 2025-09-23 05:45:27.106 | 109 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1 | |
| node2 | 19.127s | 2025-09-23 05:45:27.109 | 110 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 1 Timestamp: 2025-09-23T05:45:24.745438521Z Next consensus number: 1 Legacy running event hash: c0724fb00f1fb4e19a95bc6130c3dc629eb3d0e8dbde546ea21d3ec57a9423b40b1233be94cd3fa433e8455f4c17a7be Legacy running event mnemonic: vault-voyage-token-glue Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1450302654 Root hash: 61f3e00ce8877b76c83d9b7f0396a648dfc550814ab0e52300eaecbbbc8a82a5cab042c9e1401a12562dd54817998239 (root) ConsistencyTestingToolState / dinosaur-write-upset-human 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 tackle-speed-farm-plug 1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street 2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem 3 StringLeaf 1931016930446315563 /3 almost-atom-novel-view 4 StringLeaf 1 /4 wreck-whale-old-bottom | |||||||||
| node4 | 19.130s | 2025-09-23 05:45:27.112 | 78 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1 | |
| node4 | 19.132s | 2025-09-23 05:45:27.114 | 79 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1 | |
| node2 | 19.165s | 2025-09-23 05:45:27.147 | 111 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/2/2025/09/23/2025-09-23T05+45+24.650265559Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 19.165s | 2025-09-23 05:45:27.147 | 112 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 1 File: data/saved/preconsensus-events/2/2025/09/23/2025-09-23T05+45+24.650265559Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 19.165s | 2025-09-23 05:45:27.147 | 113 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node2 | 19.167s | 2025-09-23 05:45:27.149 | 114 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node2 | 19.173s | 2025-09-23 05:45:27.155 | 115 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node3 | 19.188s | 2025-09-23 05:45:27.170 | 109 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1 | |
| node3 | 19.191s | 2025-09-23 05:45:27.173 | 110 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 1 Timestamp: 2025-09-23T05:45:24.745438521Z Next consensus number: 1 Legacy running event hash: c0724fb00f1fb4e19a95bc6130c3dc629eb3d0e8dbde546ea21d3ec57a9423b40b1233be94cd3fa433e8455f4c17a7be Legacy running event mnemonic: vault-voyage-token-glue Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1450302654 Root hash: 61f3e00ce8877b76c83d9b7f0396a648dfc550814ab0e52300eaecbbbc8a82a5cab042c9e1401a12562dd54817998239 (root) ConsistencyTestingToolState / dinosaur-write-upset-human 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 tackle-speed-farm-plug 1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street 2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem 3 StringLeaf 1931016930446315563 /3 almost-atom-novel-view 4 StringLeaf 1 /4 wreck-whale-old-bottom | |||||||||
| node3 | 19.229s | 2025-09-23 05:45:27.211 | 111 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/3/2025/09/23/2025-09-23T05+45+24.442108548Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 19.229s | 2025-09-23 05:45:27.211 | 112 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 1 File: data/saved/preconsensus-events/3/2025/09/23/2025-09-23T05+45+24.442108548Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 19.230s | 2025-09-23 05:45:27.212 | 113 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node3 | 19.231s | 2025-09-23 05:45:27.213 | 114 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node3 | 19.237s | 2025-09-23 05:45:27.219 | 115 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node0 | 19.242s | 2025-09-23 05:45:27.224 | 110 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1 | |
| node0 | 19.244s | 2025-09-23 05:45:27.226 | 111 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 1 Timestamp: 2025-09-23T05:45:24.745438521Z Next consensus number: 1 Legacy running event hash: c0724fb00f1fb4e19a95bc6130c3dc629eb3d0e8dbde546ea21d3ec57a9423b40b1233be94cd3fa433e8455f4c17a7be Legacy running event mnemonic: vault-voyage-token-glue Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1450302654 Root hash: 61f3e00ce8877b76c83d9b7f0396a648dfc550814ab0e52300eaecbbbc8a82a5cab042c9e1401a12562dd54817998239 (root) ConsistencyTestingToolState / dinosaur-write-upset-human 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 tackle-speed-farm-plug 1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street 2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem 3 StringLeaf 1931016930446315563 /3 almost-atom-novel-view 4 StringLeaf 1 /4 wreck-whale-old-bottom | |||||||||
| node0 | 19.282s | 2025-09-23 05:45:27.264 | 112 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/0/2025/09/23/2025-09-23T05+45+24.129682821Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 19.283s | 2025-09-23 05:45:27.265 | 113 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 1 | |||||||||
| File: data/saved/preconsensus-events/0/2025/09/23/2025-09-23T05+45+24.129682821Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 19.283s | 2025-09-23 05:45:27.265 | 114 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node0 | 19.284s | 2025-09-23 05:45:27.266 | 115 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node0 | 19.290s | 2025-09-23 05:45:27.272 | 116 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node1 | 19.300s | 2025-09-23 05:45:27.282 | 109 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1 | |
| node1 | 19.303s | 2025-09-23 05:45:27.285 | 110 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 1 | |||||||||
| Timestamp: 2025-09-23T05:45:24.745438521Z | |||||||||
| Next consensus number: 1 | |||||||||
| Legacy running event hash: c0724fb00f1fb4e19a95bc6130c3dc629eb3d0e8dbde546ea21d3ec57a9423b40b1233be94cd3fa433e8455f4c17a7be | |||||||||
| Legacy running event mnemonic: vault-voyage-token-glue | |||||||||
| Rounds non-ancient: 26 | |||||||||
| Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |||||||||
| Minimum judge hash code: 1450302654 | |||||||||
| Root hash: 61f3e00ce8877b76c83d9b7f0396a648dfc550814ab0e52300eaecbbbc8a82a5cab042c9e1401a12562dd54817998239 | |||||||||
| (root) ConsistencyTestingToolState / dinosaur-write-upset-human | |||||||||
|   0 SingletonNode PlatformStateService.PLATFORM_STATE /0 tackle-speed-farm-plug | |||||||||
|   1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street | |||||||||
|   2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem | |||||||||
|   3 StringLeaf 1931016930446315563 /3 almost-atom-novel-view | |||||||||
|   4 StringLeaf 1 /4 wreck-whale-old-bottom | |||||||||
| node1 | 19.339s | 2025-09-23 05:45:27.321 | 111 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/1/2025/09/23/2025-09-23T05+45+24.577363596Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node1 | 19.340s | 2025-09-23 05:45:27.322 | 112 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 1 | |||||||||
| File: data/saved/preconsensus-events/1/2025/09/23/2025-09-23T05+45+24.577363596Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node1 | 19.340s | 2025-09-23 05:45:27.322 | 113 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node1 | 19.341s | 2025-09-23 05:45:27.323 | 114 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node1 | 19.347s | 2025-09-23 05:45:27.329 | 115 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node4 | 19.399s | 2025-09-23 05:45:27.381 | 109 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1 | |
| node4 | 19.403s | 2025-09-23 05:45:27.385 | 110 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 1 | |||||||||
| Timestamp: 2025-09-23T05:45:24.745438521Z | |||||||||
| Next consensus number: 1 | |||||||||
| Legacy running event hash: c0724fb00f1fb4e19a95bc6130c3dc629eb3d0e8dbde546ea21d3ec57a9423b40b1233be94cd3fa433e8455f4c17a7be | |||||||||
| Legacy running event mnemonic: vault-voyage-token-glue | |||||||||
| Rounds non-ancient: 26 | |||||||||
| Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |||||||||
| Minimum judge hash code: 1450302654 | |||||||||
| Root hash: 61f3e00ce8877b76c83d9b7f0396a648dfc550814ab0e52300eaecbbbc8a82a5cab042c9e1401a12562dd54817998239 | |||||||||
| (root) ConsistencyTestingToolState / dinosaur-write-upset-human | |||||||||
|   0 SingletonNode PlatformStateService.PLATFORM_STATE /0 tackle-speed-farm-plug | |||||||||
|   1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street | |||||||||
|   2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem | |||||||||
|   3 StringLeaf 1931016930446315563 /3 almost-atom-novel-view | |||||||||
|   4 StringLeaf 1 /4 wreck-whale-old-bottom | |||||||||
| node4 | 19.445s | 2025-09-23 05:45:27.427 | 111 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/4/2025/09/23/2025-09-23T05+45+24.575421851Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node4 | 19.446s | 2025-09-23 05:45:27.428 | 112 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 1 | |||||||||
| File: data/saved/preconsensus-events/4/2025/09/23/2025-09-23T05+45+24.575421851Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node4 | 19.447s | 2025-09-23 05:45:27.429 | 113 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node4 | 19.448s | 2025-09-23 05:45:27.430 | 114 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node4 | 19.455s | 2025-09-23 05:45:27.437 | 115 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node1 | 20.124s | 2025-09-23 05:45:28.106 | 117 | INFO | PLATFORM_STATUS | <platformForkJoinThread-6> | DefaultStatusStateMachine: | Platform spent 3.4 s in CHECKING. Now in ACTIVE | |
| node3 | 20.158s | 2025-09-23 05:45:28.140 | 117 | INFO | PLATFORM_STATUS | <platformForkJoinThread-4> | DefaultStatusStateMachine: | Platform spent 2.6 s in CHECKING. Now in ACTIVE | |
| node4 | 20.245s | 2025-09-23 05:45:28.227 | 117 | INFO | PLATFORM_STATUS | <platformForkJoinThread-4> | DefaultStatusStateMachine: | Platform spent 3.5 s in CHECKING. Now in ACTIVE | |
| node2 | 20.278s | 2025-09-23 05:45:28.260 | 117 | INFO | PLATFORM_STATUS | <platformForkJoinThread-3> | DefaultStatusStateMachine: | Platform spent 2.6 s in CHECKING. Now in ACTIVE | |
| node0 | 53.502s | 2025-09-23 05:46:01.484 | 711 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 56 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node3 | 53.531s | 2025-09-23 05:46:01.513 | 719 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 56 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node1 | 53.579s | 2025-09-23 05:46:01.561 | 709 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 56 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node4 | 53.719s | 2025-09-23 05:46:01.701 | 701 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 56 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node2 | 53.809s | 2025-09-23 05:46:01.791 | 701 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 56 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node2 | 53.912s | 2025-09-23 05:46:01.894 | 704 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 56 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/56 | |
| node2 | 53.913s | 2025-09-23 05:46:01.895 | 705 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 56 | |
| node3 | 53.922s | 2025-09-23 05:46:01.904 | 722 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 56 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/56 | |
| node3 | 53.922s | 2025-09-23 05:46:01.904 | 723 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 56 | |
| node1 | 53.991s | 2025-09-23 05:46:01.973 | 712 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 56 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/56 | |
| node1 | 53.992s | 2025-09-23 05:46:01.974 | 713 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 56 | |
| node2 | 54.006s | 2025-09-23 05:46:01.988 | 740 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 56 | |
| node2 | 54.009s | 2025-09-23 05:46:01.991 | 741 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 56 | |||||||||
| Timestamp: 2025-09-23T05:46:00.384951Z | |||||||||
| Next consensus number: 1475 | |||||||||
| Legacy running event hash: 49ab2829a2dbd816640f938dccf6384c14586e9f18f37c82025a30639235e6de121a271d6c503897b36bce37e76f7b9a | |||||||||
| Legacy running event mnemonic: cube-ahead-pigeon-wool | |||||||||
| Rounds non-ancient: 26 | |||||||||
| Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |||||||||
| Minimum judge hash code: -1793282856 | |||||||||
| Root hash: c713e9778d87e3d3d3051e1c4bde674a1a66e77b032117a703a466ee545491bceb9ff327467acd895c53033c70d919ee | |||||||||
| (root) ConsistencyTestingToolState / buffalo-spell-execute-mad | |||||||||
|   0 SingletonNode PlatformStateService.PLATFORM_STATE /0 attract-negative-recycle-alter | |||||||||
|   1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street | |||||||||
|   2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem | |||||||||
|   3 StringLeaf 4370894481584787626 /3 angry-glimpse-security-gun | |||||||||
|   4 StringLeaf 56 /4 piece-witness-slogan-broom | |||||||||
| node3 | 54.009s | 2025-09-23 05:46:01.991 | 754 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 56 | |
| node3 | 54.011s | 2025-09-23 05:46:01.993 | 755 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 56 | |||||||||
| Timestamp: 2025-09-23T05:46:00.384951Z | |||||||||
| Next consensus number: 1475 | |||||||||
| Legacy running event hash: 49ab2829a2dbd816640f938dccf6384c14586e9f18f37c82025a30639235e6de121a271d6c503897b36bce37e76f7b9a | |||||||||
| Legacy running event mnemonic: cube-ahead-pigeon-wool | |||||||||
| Rounds non-ancient: 26 | |||||||||
| Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |||||||||
| Minimum judge hash code: -1793282856 | |||||||||
| Root hash: c713e9778d87e3d3d3051e1c4bde674a1a66e77b032117a703a466ee545491bceb9ff327467acd895c53033c70d919ee | |||||||||
| (root) ConsistencyTestingToolState / buffalo-spell-execute-mad | |||||||||
|   0 SingletonNode PlatformStateService.PLATFORM_STATE /0 attract-negative-recycle-alter | |||||||||
|   1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street | |||||||||
|   2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem | |||||||||
|   3 StringLeaf 4370894481584787626 /3 angry-glimpse-security-gun | |||||||||
|   4 StringLeaf 56 /4 piece-witness-slogan-broom | |||||||||
| node2 | 54.020s | 2025-09-23 05:46:02.002 | 742 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/2/2025/09/23/2025-09-23T05+45+24.650265559Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 54.020s | 2025-09-23 05:46:02.002 | 743 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 29 | |||||||||
| File: data/saved/preconsensus-events/2/2025/09/23/2025-09-23T05+45+24.650265559Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 54.020s | 2025-09-23 05:46:02.002 | 744 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node3 | 54.021s | 2025-09-23 05:46:02.003 | 756 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/3/2025/09/23/2025-09-23T05+45+24.442108548Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 54.021s | 2025-09-23 05:46:02.003 | 757 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 29 | |||||||||
| File: data/saved/preconsensus-events/3/2025/09/23/2025-09-23T05+45+24.442108548Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 54.021s | 2025-09-23 05:46:02.003 | 758 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node2 | 54.022s | 2025-09-23 05:46:02.004 | 745 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node2 | 54.022s | 2025-09-23 05:46:02.004 | 746 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 56 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/56 {"round":56,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/56/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node3 | 54.022s | 2025-09-23 05:46:02.004 | 759 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node3 | 54.023s | 2025-09-23 05:46:02.005 | 760 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 56 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/56 {"round":56,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/56/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node4 | 54.031s | 2025-09-23 05:46:02.013 | 704 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 56 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/56 | |
| node4 | 54.032s | 2025-09-23 05:46:02.014 | 705 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 56 | |
| node0 | 54.059s | 2025-09-23 05:46:02.041 | 714 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 56 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/56 | |
| node0 | 54.060s | 2025-09-23 05:46:02.042 | 715 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 56 | |
| node1 | 54.074s | 2025-09-23 05:46:02.056 | 748 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 56 | |
| node1 | 54.076s | 2025-09-23 05:46:02.058 | 749 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 56 | |||||||||
| Timestamp: 2025-09-23T05:46:00.384951Z | |||||||||
| Next consensus number: 1475 | |||||||||
| Legacy running event hash: 49ab2829a2dbd816640f938dccf6384c14586e9f18f37c82025a30639235e6de121a271d6c503897b36bce37e76f7b9a | |||||||||
| Legacy running event mnemonic: cube-ahead-pigeon-wool | |||||||||
| Rounds non-ancient: 26 | |||||||||
| Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |||||||||
| Minimum judge hash code: -1793282856 | |||||||||
| Root hash: c713e9778d87e3d3d3051e1c4bde674a1a66e77b032117a703a466ee545491bceb9ff327467acd895c53033c70d919ee | |||||||||
| (root) ConsistencyTestingToolState / buffalo-spell-execute-mad | |||||||||
|   0 SingletonNode PlatformStateService.PLATFORM_STATE /0 attract-negative-recycle-alter | |||||||||
|   1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street | |||||||||
|   2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem | |||||||||
|   3 StringLeaf 4370894481584787626 /3 angry-glimpse-security-gun | |||||||||
|   4 StringLeaf 56 /4 piece-witness-slogan-broom | |||||||||
| node1 | 54.084s | 2025-09-23 05:46:02.066 | 750 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/1/2025/09/23/2025-09-23T05+45+24.577363596Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node1 | 54.084s | 2025-09-23 05:46:02.066 | 751 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 29 | |||||||||
| File: data/saved/preconsensus-events/1/2025/09/23/2025-09-23T05+45+24.577363596Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node1 | 54.084s | 2025-09-23 05:46:02.066 | 752 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node1 | 54.085s | 2025-09-23 05:46:02.067 | 753 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node1 | 54.086s | 2025-09-23 05:46:02.068 | 754 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 56 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/56 {"round":56,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/56/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node0 | 54.135s | 2025-09-23 05:46:02.117 | 754 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 56 | |
| node0 | 54.137s | 2025-09-23 05:46:02.119 | 755 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 56 | |||||||||
| Timestamp: 2025-09-23T05:46:00.384951Z | |||||||||
| Next consensus number: 1475 | |||||||||
| Legacy running event hash: 49ab2829a2dbd816640f938dccf6384c14586e9f18f37c82025a30639235e6de121a271d6c503897b36bce37e76f7b9a | |||||||||
| Legacy running event mnemonic: cube-ahead-pigeon-wool | |||||||||
| Rounds non-ancient: 26 | |||||||||
| Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |||||||||
| Minimum judge hash code: -1793282856 | |||||||||
| Root hash: c713e9778d87e3d3d3051e1c4bde674a1a66e77b032117a703a466ee545491bceb9ff327467acd895c53033c70d919ee | |||||||||
| (root) ConsistencyTestingToolState / buffalo-spell-execute-mad | |||||||||
|   0 SingletonNode PlatformStateService.PLATFORM_STATE /0 attract-negative-recycle-alter | |||||||||
|   1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street | |||||||||
|   2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem | |||||||||
|   3 StringLeaf 4370894481584787626 /3 angry-glimpse-security-gun | |||||||||
|   4 StringLeaf 56 /4 piece-witness-slogan-broom | |||||||||
| node4 | 54.139s | 2025-09-23 05:46:02.121 | 736 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 56 | |
| node4 | 54.142s | 2025-09-23 05:46:02.124 | 737 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 56 | |||||||||
| Timestamp: 2025-09-23T05:46:00.384951Z | |||||||||
| Next consensus number: 1475 | |||||||||
| Legacy running event hash: 49ab2829a2dbd816640f938dccf6384c14586e9f18f37c82025a30639235e6de121a271d6c503897b36bce37e76f7b9a | |||||||||
| Legacy running event mnemonic: cube-ahead-pigeon-wool | |||||||||
| Rounds non-ancient: 26 | |||||||||
| Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |||||||||
| Minimum judge hash code: -1793282856 | |||||||||
| Root hash: c713e9778d87e3d3d3051e1c4bde674a1a66e77b032117a703a466ee545491bceb9ff327467acd895c53033c70d919ee | |||||||||
| (root) ConsistencyTestingToolState / buffalo-spell-execute-mad | |||||||||
|   0 SingletonNode PlatformStateService.PLATFORM_STATE /0 attract-negative-recycle-alter | |||||||||
|   1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street | |||||||||
|   2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem | |||||||||
|   3 StringLeaf 4370894481584787626 /3 angry-glimpse-security-gun | |||||||||
|   4 StringLeaf 56 /4 piece-witness-slogan-broom | |||||||||
| node0 | 54.146s | 2025-09-23 05:46:02.128 | 756 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/0/2025/09/23/2025-09-23T05+45+24.129682821Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 54.146s | 2025-09-23 05:46:02.128 | 757 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 29 | |||||||||
| File: data/saved/preconsensus-events/0/2025/09/23/2025-09-23T05+45+24.129682821Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 54.146s | 2025-09-23 05:46:02.128 | 758 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node0 | 54.147s | 2025-09-23 05:46:02.129 | 759 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node0 | 54.148s | 2025-09-23 05:46:02.130 | 760 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 56 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/56 {"round":56,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/56/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node4 | 54.150s | 2025-09-23 05:46:02.132 | 738 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/4/2025/09/23/2025-09-23T05+45+24.575421851Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node4 | 54.150s | 2025-09-23 05:46:02.132 | 739 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 29 | |||||||||
| File: data/saved/preconsensus-events/4/2025/09/23/2025-09-23T05+45+24.575421851Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node4 | 54.150s | 2025-09-23 05:46:02.132 | 740 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node4 | 54.151s | 2025-09-23 05:46:02.133 | 741 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node4 | 54.152s | 2025-09-23 05:46:02.134 | 742 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 56 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/56 {"round":56,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/56/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
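The round-56 snapshot sequence above ends with each node emitting a "Finished writing state ... to disk" line carrying the round, reason, and target directory. A minimal sketch of extracting those fields, assuming only the line layout visible in this log sample (the regex and the sample path below are illustrative, not an official log format):

```python
import re

# Pattern inferred from the StateSavedToDiskPayload lines in this log;
# fields: round number, reason enum, destination directory.
FINISHED_RE = re.compile(
    r"Finished writing state for round (?P<round>\d+) to disk\. "
    r"Reason: (?P<reason>[A-Z_]+), directory: (?P<directory>\S+)"
)

def parse_state_saved(line: str):
    """Return (round, reason, directory), or None if the line doesn't match."""
    m = FINISHED_RE.search(line)
    if m is None:
        return None
    return int(m.group("round")), m.group("reason"), m.group("directory")

# Hypothetical sample line mimicking the log entries above.
sample = ("Finished writing state for round 56 to disk. "
          "Reason: PERIODIC_SNAPSHOT, directory: /tmp/demo/56")
print(parse_state_saved(sample))  # → (56, 'PERIODIC_SNAPSHOT', '/tmp/demo/56')
```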
| node3 | 1m 53.601s | 2025-09-23 05:47:01.583 | 1777 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 148 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node0 | 1m 53.612s | 2025-09-23 05:47:01.594 | 1781 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 148 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node2 | 1m 53.639s | 2025-09-23 05:47:01.621 | 1757 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 148 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node4 | 1m 53.683s | 2025-09-23 05:47:01.665 | 1781 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 148 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node1 | 1m 53.713s | 2025-09-23 05:47:01.695 | 1779 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 148 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node1 | 1m 53.916s | 2025-09-23 05:47:01.898 | 1782 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 148 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/148 | |
| node1 | 1m 53.916s | 2025-09-23 05:47:01.898 | 1783 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 148 | |
| node3 | 1m 53.958s | 2025-09-23 05:47:01.940 | 1780 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 148 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/148 | |
| node3 | 1m 53.960s | 2025-09-23 05:47:01.942 | 1781 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 148 | |
| node0 | 1m 53.986s | 2025-09-23 05:47:01.968 | 1784 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 148 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/148 | |
| node0 | 1m 53.987s | 2025-09-23 05:47:01.969 | 1785 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 148 | |
| node1 | 1m 54.008s | 2025-09-23 05:47:01.990 | 1814 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 148 | |
| node1 | 1m 54.010s | 2025-09-23 05:47:01.992 | 1815 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 148 | |||||||||
| Timestamp: 2025-09-23T05:47:00.027046875Z | |||||||||
| Next consensus number: 3989 | |||||||||
| Legacy running event hash: 60dd7dfcb2a84a7d6827c41db966caaef720116905ec7f9d04ffeab8e79b35da8099943cdbeef479b3bbb15875ef424d | |||||||||
| Legacy running event mnemonic: brick-sauce-prison-exile | |||||||||
| Rounds non-ancient: 26 | |||||||||
| Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |||||||||
| Minimum judge hash code: 673253526 | |||||||||
| Root hash: 413a0e648135aba00a4f492a80108f0b25ee48acca2c02ca48552083b888319c748d8bc5617fe703cb32ccbbb4038a2e | |||||||||
| (root) ConsistencyTestingToolState / chef-smart-sunny-metal | |||||||||
|   0 SingletonNode PlatformStateService.PLATFORM_STATE /0 wisdom-venue-ripple-route | |||||||||
|   1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street | |||||||||
|   2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem | |||||||||
|   3 StringLeaf 4959534697973523919 /3 camera-wing-million-dinosaur | |||||||||
|   4 StringLeaf 148 /4 mosquito-crack-camera-bullet | |||||||||
| node4 | 1m 54.013s | 2025-09-23 05:47:01.995 | 1784 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 148 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/148 | |
| node4 | 1m 54.013s | 2025-09-23 05:47:01.995 | 1785 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 148 | |
| node1 | 1m 54.017s | 2025-09-23 05:47:01.999 | 1816 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/1/2025/09/23/2025-09-23T05+45+24.577363596Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node1 | 1m 54.017s | 2025-09-23 05:47:01.999 | 1817 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 121
| File: data/saved/preconsensus-events/1/2025/09/23/2025-09-23T05+45+24.577363596Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node1 | 1m 54.017s | 2025-09-23 05:47:01.999 | 1818 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node1 | 1m 54.020s | 2025-09-23 05:47:02.002 | 1819 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node1 | 1m 54.021s | 2025-09-23 05:47:02.003 | 1820 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 148 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/148 {"round":148,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/148/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node2 | 1m 54.051s | 2025-09-23 05:47:02.033 | 1760 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 148 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/148 | |
| node2 | 1m 54.052s | 2025-09-23 05:47:02.034 | 1761 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 148 | |
| node3 | 1m 54.053s | 2025-09-23 05:47:02.035 | 1816 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 148 | |
| node3 | 1m 54.055s | 2025-09-23 05:47:02.037 | 1817 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 148
| Timestamp: 2025-09-23T05:47:00.027046875Z
| Next consensus number: 3989
| Legacy running event hash: 60dd7dfcb2a84a7d6827c41db966caaef720116905ec7f9d04ffeab8e79b35da8099943cdbeef479b3bbb15875ef424d
| Legacy running event mnemonic: brick-sauce-prison-exile
| Rounds non-ancient: 26
| Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
| Minimum judge hash code: 673253526
| Root hash: 413a0e648135aba00a4f492a80108f0b25ee48acca2c02ca48552083b888319c748d8bc5617fe703cb32ccbbb4038a2e
| (root) ConsistencyTestingToolState / chef-smart-sunny-metal
|     0 SingletonNode PlatformStateService.PLATFORM_STATE /0 wisdom-venue-ripple-route
|     1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street
|     2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem
|     3 StringLeaf 4959534697973523919 /3 camera-wing-million-dinosaur
|     4 StringLeaf 148 /4 mosquito-crack-camera-bullet | |||||||||
| node3 | 1m 54.063s | 2025-09-23 05:47:02.045 | 1818 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/3/2025/09/23/2025-09-23T05+45+24.442108548Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 1m 54.063s | 2025-09-23 05:47:02.045 | 1819 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 121
| File: data/saved/preconsensus-events/3/2025/09/23/2025-09-23T05+45+24.442108548Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 1m 54.064s | 2025-09-23 05:47:02.046 | 1820 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node3 | 1m 54.067s | 2025-09-23 05:47:02.049 | 1821 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node3 | 1m 54.067s | 2025-09-23 05:47:02.049 | 1822 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 148 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/148 {"round":148,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/148/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node0 | 1m 54.072s | 2025-09-23 05:47:02.054 | 1820 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 148 | |
| node0 | 1m 54.074s | 2025-09-23 05:47:02.056 | 1824 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 148
| Timestamp: 2025-09-23T05:47:00.027046875Z
| Next consensus number: 3989
| Legacy running event hash: 60dd7dfcb2a84a7d6827c41db966caaef720116905ec7f9d04ffeab8e79b35da8099943cdbeef479b3bbb15875ef424d
| Legacy running event mnemonic: brick-sauce-prison-exile
| Rounds non-ancient: 26
| Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
| Minimum judge hash code: 673253526
| Root hash: 413a0e648135aba00a4f492a80108f0b25ee48acca2c02ca48552083b888319c748d8bc5617fe703cb32ccbbb4038a2e
| (root) ConsistencyTestingToolState / chef-smart-sunny-metal
|     0 SingletonNode PlatformStateService.PLATFORM_STATE /0 wisdom-venue-ripple-route
|     1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street
|     2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem
|     3 StringLeaf 4959534697973523919 /3 camera-wing-million-dinosaur
|     4 StringLeaf 148 /4 mosquito-crack-camera-bullet | |||||||||
| node0 | 1m 54.083s | 2025-09-23 05:47:02.065 | 1825 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/0/2025/09/23/2025-09-23T05+45+24.129682821Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 1m 54.084s | 2025-09-23 05:47:02.066 | 1826 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 121
| File: data/saved/preconsensus-events/0/2025/09/23/2025-09-23T05+45+24.129682821Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 1m 54.084s | 2025-09-23 05:47:02.066 | 1827 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node0 | 1m 54.087s | 2025-09-23 05:47:02.069 | 1828 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node0 | 1m 54.088s | 2025-09-23 05:47:02.070 | 1829 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 148 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/148 {"round":148,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/148/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node4 | 1m 54.105s | 2025-09-23 05:47:02.087 | 1820 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 148 | |
| node4 | 1m 54.107s | 2025-09-23 05:47:02.089 | 1821 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 148
| Timestamp: 2025-09-23T05:47:00.027046875Z
| Next consensus number: 3989
| Legacy running event hash: 60dd7dfcb2a84a7d6827c41db966caaef720116905ec7f9d04ffeab8e79b35da8099943cdbeef479b3bbb15875ef424d
| Legacy running event mnemonic: brick-sauce-prison-exile
| Rounds non-ancient: 26
| Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
| Minimum judge hash code: 673253526
| Root hash: 413a0e648135aba00a4f492a80108f0b25ee48acca2c02ca48552083b888319c748d8bc5617fe703cb32ccbbb4038a2e
| (root) ConsistencyTestingToolState / chef-smart-sunny-metal
|     0 SingletonNode PlatformStateService.PLATFORM_STATE /0 wisdom-venue-ripple-route
|     1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street
|     2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem
|     3 StringLeaf 4959534697973523919 /3 camera-wing-million-dinosaur
|     4 StringLeaf 148 /4 mosquito-crack-camera-bullet | |||||||||
| node4 | 1m 54.116s | 2025-09-23 05:47:02.098 | 1822 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/4/2025/09/23/2025-09-23T05+45+24.575421851Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node4 | 1m 54.116s | 2025-09-23 05:47:02.098 | 1823 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 121
| File: data/saved/preconsensus-events/4/2025/09/23/2025-09-23T05+45+24.575421851Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node4 | 1m 54.116s | 2025-09-23 05:47:02.098 | 1824 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node4 | 1m 54.119s | 2025-09-23 05:47:02.101 | 1825 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node4 | 1m 54.120s | 2025-09-23 05:47:02.102 | 1826 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 148 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/148 {"round":148,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/148/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node2 | 1m 54.198s | 2025-09-23 05:47:02.180 | 1796 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 148 | |
| node2 | 1m 54.201s | 2025-09-23 05:47:02.183 | 1797 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 148
| Timestamp: 2025-09-23T05:47:00.027046875Z
| Next consensus number: 3989
| Legacy running event hash: 60dd7dfcb2a84a7d6827c41db966caaef720116905ec7f9d04ffeab8e79b35da8099943cdbeef479b3bbb15875ef424d
| Legacy running event mnemonic: brick-sauce-prison-exile
| Rounds non-ancient: 26
| Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
| Minimum judge hash code: 673253526
| Root hash: 413a0e648135aba00a4f492a80108f0b25ee48acca2c02ca48552083b888319c748d8bc5617fe703cb32ccbbb4038a2e
| (root) ConsistencyTestingToolState / chef-smart-sunny-metal
|     0 SingletonNode PlatformStateService.PLATFORM_STATE /0 wisdom-venue-ripple-route
|     1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street
|     2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem
|     3 StringLeaf 4959534697973523919 /3 camera-wing-million-dinosaur
|     4 StringLeaf 148 /4 mosquito-crack-camera-bullet | |||||||||
| node2 | 1m 54.210s | 2025-09-23 05:47:02.192 | 1798 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/2/2025/09/23/2025-09-23T05+45+24.650265559Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 1m 54.210s | 2025-09-23 05:47:02.192 | 1799 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 121
| File: data/saved/preconsensus-events/2/2025/09/23/2025-09-23T05+45+24.650265559Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 1m 54.210s | 2025-09-23 05:47:02.192 | 1800 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node2 | 1m 54.214s | 2025-09-23 05:47:02.196 | 1801 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node2 | 1m 54.214s | 2025-09-23 05:47:02.196 | 1802 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 148 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/148 {"round":148,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/148/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node0 | 2m 53.341s | 2025-09-23 05:48:01.323 | 2823 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 238 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node4 | 2m 53.477s | 2025-09-23 05:48:01.459 | 2829 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 238 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node3 | 2m 53.610s | 2025-09-23 05:48:01.592 | 2825 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 238 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node2 | 2m 53.674s | 2025-09-23 05:48:01.656 | 2839 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 238 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node1 | 2m 53.678s | 2025-09-23 05:48:01.660 | 2847 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 238 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node3 | 2m 53.899s | 2025-09-23 05:48:01.881 | 2828 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 238 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/238 | |
| node3 | 2m 53.900s | 2025-09-23 05:48:01.882 | 2829 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 238 | |
| node1 | 2m 53.969s | 2025-09-23 05:48:01.951 | 2850 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 238 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/238 | |
| node1 | 2m 53.970s | 2025-09-23 05:48:01.952 | 2851 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 238 | |
| node3 | 2m 53.996s | 2025-09-23 05:48:01.978 | 2860 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 238 | |
| node3 | 2m 53.999s | 2025-09-23 05:48:01.981 | 2861 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 238
| Timestamp: 2025-09-23T05:48:00.057986224Z
| Next consensus number: 6535
| Legacy running event hash: 7586645a270846ebf9c8aa77f4b127f207cb29dd6dc87d32578564d294cd7a2f7e370d8fc336f77b6a309e18d0fa3164
| Legacy running event mnemonic: radio-mercy-three-talent
| Rounds non-ancient: 26
| Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
| Minimum judge hash code: 1594774619
| Root hash: 6437ac6f182af9cfad7ea514d636f7554b018fa6fec8eb117e308328e66dc1c9a7b5bb235f80d5416a0fb82a76ce71dd
| (root) ConsistencyTestingToolState / manage-impulse-aim-shiver
|     0 SingletonNode PlatformStateService.PLATFORM_STATE /0 position-swarm-music-end
|     1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street
|     2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem
|     3 StringLeaf 2930039329747737888 /3 leisure-recycle-vendor-solve
|     4 StringLeaf 238 /4 degree-garment-fashion-empower | |||||||||
| node3 | 2m 54.006s | 2025-09-23 05:48:01.988 | 2862 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/3/2025/09/23/2025-09-23T05+45+24.442108548Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 2m 54.006s | 2025-09-23 05:48:01.988 | 2863 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 211
| File: data/saved/preconsensus-events/3/2025/09/23/2025-09-23T05+45+24.442108548Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 2m 54.006s | 2025-09-23 05:48:01.988 | 2864 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node3 | 2m 54.012s | 2025-09-23 05:48:01.994 | 2865 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node3 | 2m 54.012s | 2025-09-23 05:48:01.994 | 2866 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 238 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/238 {"round":238,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/238/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node0 | 2m 54.026s | 2025-09-23 05:48:02.008 | 2836 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 238 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/238 | |
| node0 | 2m 54.027s | 2025-09-23 05:48:02.009 | 2837 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 238 | |
| node4 | 2m 54.038s | 2025-09-23 05:48:02.020 | 2842 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 238 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/238 | |
| node4 | 2m 54.039s | 2025-09-23 05:48:02.021 | 2843 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 238 | |
| node1 | 2m 54.054s | 2025-09-23 05:48:02.036 | 2886 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 238 | |
| node1 | 2m 54.056s | 2025-09-23 05:48:02.038 | 2887 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 238
| Timestamp: 2025-09-23T05:48:00.057986224Z
| Next consensus number: 6535
| Legacy running event hash: 7586645a270846ebf9c8aa77f4b127f207cb29dd6dc87d32578564d294cd7a2f7e370d8fc336f77b6a309e18d0fa3164
| Legacy running event mnemonic: radio-mercy-three-talent
| Rounds non-ancient: 26
| Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
| Minimum judge hash code: 1594774619
| Root hash: 6437ac6f182af9cfad7ea514d636f7554b018fa6fec8eb117e308328e66dc1c9a7b5bb235f80d5416a0fb82a76ce71dd
| (root) ConsistencyTestingToolState / manage-impulse-aim-shiver
|     0 SingletonNode PlatformStateService.PLATFORM_STATE /0 position-swarm-music-end
|     1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street
|     2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem
|     3 StringLeaf 2930039329747737888 /3 leisure-recycle-vendor-solve
|     4 StringLeaf 238 /4 degree-garment-fashion-empower | |||||||||
| node1 | 2m 54.064s | 2025-09-23 05:48:02.046 | 2888 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/1/2025/09/23/2025-09-23T05+45+24.577363596Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node1 | 2m 54.064s | 2025-09-23 05:48:02.046 | 2889 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 211
| File: data/saved/preconsensus-events/1/2025/09/23/2025-09-23T05+45+24.577363596Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node1 | 2m 54.065s | 2025-09-23 05:48:02.047 | 2890 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node1 | 2m 54.071s | 2025-09-23 05:48:02.053 | 2891 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node1 | 2m 54.072s | 2025-09-23 05:48:02.054 | 2892 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 238 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/238 {"round":238,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/238/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node2 | 2m 54.092s | 2025-09-23 05:48:02.074 | 2852 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 238 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/238 | |
| node2 | 2m 54.094s | 2025-09-23 05:48:02.076 | 2853 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 238 | |
| node0 | 2m 54.117s | 2025-09-23 05:48:02.099 | 2868 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 238 | |
| node0 | 2m 54.119s | 2025-09-23 05:48:02.101 | 2869 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 238
| Timestamp: 2025-09-23T05:48:00.057986224Z
| Next consensus number: 6535
| Legacy running event hash: 7586645a270846ebf9c8aa77f4b127f207cb29dd6dc87d32578564d294cd7a2f7e370d8fc336f77b6a309e18d0fa3164
| Legacy running event mnemonic: radio-mercy-three-talent
| Rounds non-ancient: 26
| Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
| Minimum judge hash code: 1594774619
| Root hash: 6437ac6f182af9cfad7ea514d636f7554b018fa6fec8eb117e308328e66dc1c9a7b5bb235f80d5416a0fb82a76ce71dd
| (root) ConsistencyTestingToolState / manage-impulse-aim-shiver
|     0 SingletonNode PlatformStateService.PLATFORM_STATE /0 position-swarm-music-end
|     1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street
|     2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem
|     3 StringLeaf 2930039329747737888 /3 leisure-recycle-vendor-solve
|     4 StringLeaf 238 /4 degree-garment-fashion-empower | |||||||||
| node0 | 2m 54.126s | 2025-09-23 05:48:02.108 | 2870 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/0/2025/09/23/2025-09-23T05+45+24.129682821Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 2m 54.126s | 2025-09-23 05:48:02.108 | 2871 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 211
| File: data/saved/preconsensus-events/0/2025/09/23/2025-09-23T05+45+24.129682821Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 2m 54.126s | 2025-09-23 05:48:02.108 | 2872 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node0 | 2m 54.131s | 2025-09-23 05:48:02.113 | 2873 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node0 | 2m 54.132s | 2025-09-23 05:48:02.114 | 2874 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 238 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/238 {"round":238,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/238/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node4 | 2m 54.134s | 2025-09-23 05:48:02.116 | 2874 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 238 | |
| node4 | 2m 54.136s | 2025-09-23 05:48:02.118 | 2875 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 238
| Timestamp: 2025-09-23T05:48:00.057986224Z
| Next consensus number: 6535
| Legacy running event hash: 7586645a270846ebf9c8aa77f4b127f207cb29dd6dc87d32578564d294cd7a2f7e370d8fc336f77b6a309e18d0fa3164
| Legacy running event mnemonic: radio-mercy-three-talent
| Rounds non-ancient: 26
| Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
| Minimum judge hash code: 1594774619
| Root hash: 6437ac6f182af9cfad7ea514d636f7554b018fa6fec8eb117e308328e66dc1c9a7b5bb235f80d5416a0fb82a76ce71dd
| (root) ConsistencyTestingToolState / manage-impulse-aim-shiver
|     0 SingletonNode PlatformStateService.PLATFORM_STATE /0 position-swarm-music-end
|     1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street
|     2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem
|     3 StringLeaf 2930039329747737888 /3 leisure-recycle-vendor-solve
|     4 StringLeaf 238 /4 degree-garment-fashion-empower | |||||||||
| node4 | 2m 54.143s | 2025-09-23 05:48:02.125 | 2876 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/4/2025/09/23/2025-09-23T05+45+24.575421851Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node4 | 2m 54.143s | 2025-09-23 05:48:02.125 | 2877 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 211
| File: data/saved/preconsensus-events/4/2025/09/23/2025-09-23T05+45+24.575421851Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node4 | 2m 54.143s | 2025-09-23 05:48:02.125 | 2878 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node4 | 2m 54.148s | 2025-09-23 05:48:02.130 | 2879 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node4 | 2m 54.149s | 2025-09-23 05:48:02.131 | 2880 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 238 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/238 {"round":238,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/238/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node2 | 2m 54.210s | 2025-09-23 05:48:02.192 | 2896 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 238 | |
| node2 | 2m 54.212s | 2025-09-23 05:48:02.194 | 2897 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 238
| Timestamp: 2025-09-23T05:48:00.057986224Z
| Next consensus number: 6535
| Legacy running event hash: 7586645a270846ebf9c8aa77f4b127f207cb29dd6dc87d32578564d294cd7a2f7e370d8fc336f77b6a309e18d0fa3164
| Legacy running event mnemonic: radio-mercy-three-talent
| Rounds non-ancient: 26
| Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
| Minimum judge hash code: 1594774619
| Root hash: 6437ac6f182af9cfad7ea514d636f7554b018fa6fec8eb117e308328e66dc1c9a7b5bb235f80d5416a0fb82a76ce71dd
| (root) ConsistencyTestingToolState / manage-impulse-aim-shiver
|     0 SingletonNode PlatformStateService.PLATFORM_STATE /0 position-swarm-music-end
|     1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street
|     2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem
|     3 StringLeaf 2930039329747737888 /3 leisure-recycle-vendor-solve
|     4 StringLeaf 238 /4 degree-garment-fashion-empower | |||||||||
| node2 | 2m 54.222s | 2025-09-23 05:48:02.204 | 2898 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/2/2025/09/23/2025-09-23T05+45+24.650265559Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 2m 54.222s | 2025-09-23 05:48:02.204 | 2899 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 211
File: data/saved/preconsensus-events/2/2025/09/23/2025-09-23T05+45+24.650265559Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 2m 54.223s | 2025-09-23 05:48:02.205 | 2900 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node2 | 2m 54.228s | 2025-09-23 05:48:02.210 | 2901 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node2 | 2m 54.229s | 2025-09-23 05:48:02.211 | 2902 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 238 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/238 {"round":238,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/238/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node1 | 3m 11.573s | 2025-09-23 05:48:19.555 | 3212 | WARN | SOCKET_EXCEPTIONS | <<platform-core: SyncProtocolWith4 1 to 4>> | NetworkUtils: | Connection broken: 1 -> 4 | |
| java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.FilterInputStream.read(FilterInputStream.java:71)
    at org.hiero.base.io.streams.AugmentedDataInputStream.read(AugmentedDataInputStream.java:57)
    at com.swirlds.platform.network.communication.states.WaitForAcceptReject.transition(WaitForAcceptReject.java:48)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583) | |||||||||
| node0 | 3m 11.574s | 2025-09-23 05:48:19.556 | 3231 | WARN | SOCKET_EXCEPTIONS | <<platform-core: SyncProtocolWith4 0 to 4>> | NetworkUtils: | Connection broken: 0 -> 4 | |
| java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-23T05:48:19.554491515Z
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-23T05:48:19.554491515Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.sendAndReceiveEvents(ShadowgraphSynchronizer.java:241)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:201)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
    ... 7 more
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 12 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$sendEventsTheyNeed$8(SyncUtils.java:234)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
    at java.base/java.io.DataInputStream.readByte(DataInputStream.java:275)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readByte(AugmentedDataInputStream.java:144)
    at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readEventsINeed$9(SyncUtils.java:278)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
    ... 12 more | |||||||||
| node3 | 3m 11.574s | 2025-09-23 05:48:19.556 | 3200 | WARN | SOCKET_EXCEPTIONS | <<platform-core: SyncProtocolWith4 3 to 4>> | NetworkUtils: | Connection broken: 3 -> 4 | |
| java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.FilterInputStream.read(FilterInputStream.java:71)
    at org.hiero.base.io.streams.AugmentedDataInputStream.read(AugmentedDataInputStream.java:57)
    at com.swirlds.platform.network.communication.states.SentInitiate.transition(SentInitiate.java:73)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583) | |||||||||
| node2 | 3m 11.575s | 2025-09-23 05:48:19.557 | 3234 | WARN | SOCKET_EXCEPTIONS | <<platform-core: SyncProtocolWith4 2 to 4>> | NetworkUtils: | Connection broken: 2 -> 4 | |
| java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.FilterInputStream.read(FilterInputStream.java:71)
    at org.hiero.base.io.streams.AugmentedDataInputStream.read(AugmentedDataInputStream.java:57)
    at com.swirlds.platform.network.communication.states.SentInitiate.transition(SentInitiate.java:73)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583) | |||||||||
| node0 | 3m 53.512s | 2025-09-23 05:49:01.494 | 4118 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 344 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node1 | 3m 53.580s | 2025-09-23 05:49:01.562 | 4126 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 344 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node3 | 3m 53.722s | 2025-09-23 05:49:01.704 | 4107 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 344 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node2 | 3m 53.742s | 2025-09-23 05:49:01.724 | 4084 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 344 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node2 | 3m 54.049s | 2025-09-23 05:49:02.031 | 4087 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 344 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/344 | |
| node2 | 3m 54.050s | 2025-09-23 05:49:02.032 | 4088 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 344 | |
| node3 | 3m 54.145s | 2025-09-23 05:49:02.127 | 4111 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 344 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/344 | |
| node3 | 3m 54.145s | 2025-09-23 05:49:02.127 | 4113 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 344 | |
| node2 | 3m 54.152s | 2025-09-23 05:49:02.134 | 4121 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 344 | |
| node2 | 3m 54.154s | 2025-09-23 05:49:02.136 | 4122 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 344
Timestamp: 2025-09-23T05:49:00.447561Z
Next consensus number: 8403
Legacy running event hash: d52343758c3056c8af10661929cc3fc283fa82d965cc6d0325557c61b54a681929a4c3049f04fc49426183574f2489fa
Legacy running event mnemonic: forum-huge-key-suit
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 2099881959
Root hash: 798ff755d927c9c47c500abb0868c0e86cea5878a4805fe4c05c0b876d35131c163fb886a53559a9f5ec7b4ff3042cdf
(root) ConsistencyTestingToolState / cradle-betray-swamp-shy
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 mouse-swamp-jealous-pistol
    1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street
    2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem
    3 StringLeaf -1837639509931371316 /3 outside-finger-pink-robust
    4 StringLeaf 344 /4 refuse-fetch-raven-bid | |||||||||
| node2 | 3m 54.165s | 2025-09-23 05:49:02.147 | 4123 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/2/2025/09/23/2025-09-23T05+45+24.650265559Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 3m 54.165s | 2025-09-23 05:49:02.147 | 4124 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 317
File: data/saved/preconsensus-events/2/2025/09/23/2025-09-23T05+45+24.650265559Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 3m 54.165s | 2025-09-23 05:49:02.147 | 4125 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node2 | 3m 54.172s | 2025-09-23 05:49:02.154 | 4126 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node2 | 3m 54.172s | 2025-09-23 05:49:02.154 | 4127 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 344 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/344 {"round":344,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/344/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node1 | 3m 54.215s | 2025-09-23 05:49:02.197 | 4132 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 344 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/344 | |
| node1 | 3m 54.216s | 2025-09-23 05:49:02.198 | 4133 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 344 | |
| node3 | 3m 54.232s | 2025-09-23 05:49:02.214 | 4153 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 344 | |
| node3 | 3m 54.234s | 2025-09-23 05:49:02.216 | 4170 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 344
Timestamp: 2025-09-23T05:49:00.447561Z
Next consensus number: 8403
Legacy running event hash: d52343758c3056c8af10661929cc3fc283fa82d965cc6d0325557c61b54a681929a4c3049f04fc49426183574f2489fa
Legacy running event mnemonic: forum-huge-key-suit
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 2099881959
Root hash: 798ff755d927c9c47c500abb0868c0e86cea5878a4805fe4c05c0b876d35131c163fb886a53559a9f5ec7b4ff3042cdf
(root) ConsistencyTestingToolState / cradle-betray-swamp-shy
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 mouse-swamp-jealous-pistol
    1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street
    2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem
    3 StringLeaf -1837639509931371316 /3 outside-finger-pink-robust
    4 StringLeaf 344 /4 refuse-fetch-raven-bid | |||||||||
| node3 | 3m 54.242s | 2025-09-23 05:49:02.224 | 4171 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/3/2025/09/23/2025-09-23T05+45+24.442108548Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 3m 54.242s | 2025-09-23 05:49:02.224 | 4172 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 317
File: data/saved/preconsensus-events/3/2025/09/23/2025-09-23T05+45+24.442108548Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 3m 54.242s | 2025-09-23 05:49:02.224 | 4173 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node3 | 3m 54.248s | 2025-09-23 05:49:02.230 | 4174 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node3 | 3m 54.249s | 2025-09-23 05:49:02.231 | 4175 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 344 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/344 {"round":344,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/344/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node0 | 3m 54.283s | 2025-09-23 05:49:02.265 | 4121 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 344 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/344 | |
| node0 | 3m 54.284s | 2025-09-23 05:49:02.266 | 4125 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 344 | |
| node1 | 3m 54.301s | 2025-09-23 05:49:02.283 | 4179 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 344 | |
| node1 | 3m 54.302s | 2025-09-23 05:49:02.284 | 4180 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 344
Timestamp: 2025-09-23T05:49:00.447561Z
Next consensus number: 8403
Legacy running event hash: d52343758c3056c8af10661929cc3fc283fa82d965cc6d0325557c61b54a681929a4c3049f04fc49426183574f2489fa
Legacy running event mnemonic: forum-huge-key-suit
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 2099881959
Root hash: 798ff755d927c9c47c500abb0868c0e86cea5878a4805fe4c05c0b876d35131c163fb886a53559a9f5ec7b4ff3042cdf
(root) ConsistencyTestingToolState / cradle-betray-swamp-shy
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 mouse-swamp-jealous-pistol
    1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street
    2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem
    3 StringLeaf -1837639509931371316 /3 outside-finger-pink-robust
    4 StringLeaf 344 /4 refuse-fetch-raven-bid | |||||||||
| node1 | 3m 54.311s | 2025-09-23 05:49:02.293 | 4181 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/1/2025/09/23/2025-09-23T05+45+24.577363596Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node1 | 3m 54.311s | 2025-09-23 05:49:02.293 | 4182 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 317
File: data/saved/preconsensus-events/1/2025/09/23/2025-09-23T05+45+24.577363596Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node1 | 3m 54.312s | 2025-09-23 05:49:02.294 | 4183 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node1 | 3m 54.319s | 2025-09-23 05:49:02.301 | 4184 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node1 | 3m 54.319s | 2025-09-23 05:49:02.301 | 4185 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 344 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/344 {"round":344,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/344/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node0 | 3m 54.365s | 2025-09-23 05:49:02.347 | 4163 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 344 | |
| node0 | 3m 54.367s | 2025-09-23 05:49:02.349 | 4164 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 344
Timestamp: 2025-09-23T05:49:00.447561Z
Next consensus number: 8403
Legacy running event hash: d52343758c3056c8af10661929cc3fc283fa82d965cc6d0325557c61b54a681929a4c3049f04fc49426183574f2489fa
Legacy running event mnemonic: forum-huge-key-suit
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 2099881959
Root hash: 798ff755d927c9c47c500abb0868c0e86cea5878a4805fe4c05c0b876d35131c163fb886a53559a9f5ec7b4ff3042cdf
(root) ConsistencyTestingToolState / cradle-betray-swamp-shy
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 mouse-swamp-jealous-pistol
    1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street
    2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem
    3 StringLeaf -1837639509931371316 /3 outside-finger-pink-robust
    4 StringLeaf 344 /4 refuse-fetch-raven-bid | |||||||||
| node0 | 3m 54.374s | 2025-09-23 05:49:02.356 | 4165 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/0/2025/09/23/2025-09-23T05+45+24.129682821Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 3m 54.374s | 2025-09-23 05:49:02.356 | 4166 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 317
File: data/saved/preconsensus-events/0/2025/09/23/2025-09-23T05+45+24.129682821Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 3m 54.375s | 2025-09-23 05:49:02.357 | 4167 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node0 | 3m 54.381s | 2025-09-23 05:49:02.363 | 4168 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node0 | 3m 54.381s | 2025-09-23 05:49:02.363 | 4169 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 344 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/344 {"round":344,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/344/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node0 | 4m 53.391s | 2025-09-23 05:50:01.373 | 5337 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 443 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node2 | 4m 53.506s | 2025-09-23 05:50:01.488 | 5325 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 443 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node1 | 4m 53.521s | 2025-09-23 05:50:01.503 | 5305 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 443 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node3 | 4m 53.550s | 2025-09-23 05:50:01.532 | 5307 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 443 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node1 | 4m 53.882s | 2025-09-23 05:50:01.864 | 5308 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 443 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/443 | |
| node1 | 4m 53.882s | 2025-09-23 05:50:01.864 | 5309 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 443 | |
| node3 | 4m 53.957s | 2025-09-23 05:50:01.939 | 5310 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 443 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/443 | |
| node3 | 4m 53.958s | 2025-09-23 05:50:01.940 | 5311 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 443 | |
| node1 | 4m 53.967s | 2025-09-23 05:50:01.949 | 5340 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 443 | |
| node1 | 4m 53.969s | 2025-09-23 05:50:01.951 | 5341 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 443
Timestamp: 2025-09-23T05:50:00.097636Z
Next consensus number: 9925
Legacy running event hash: 143a70df4533710d6427795e30b0a9b5f53be75925eaad34616d105d7b021662cfc28c5045612443694f596619c724d1
Legacy running event mnemonic: mansion-brief-please-canvas
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -2085009100
Root hash: a9fd921a39a0b3589dda6ffd030171d4a797a201d4b0689904eceac1db3da5dabfe6054f7ca01320eebd591afc1ab86c
(root) ConsistencyTestingToolState / scatter-rubber-cross-never
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 version-parade-cloth-alcohol
    1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street
    2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem
    3 StringLeaf 6669061482678719376 /3 voice-crawl-light-peasant
    4 StringLeaf 443 /4 buddy-hold-keep-hat | |||||||||
| node1 | 4m 53.976s | 2025-09-23 05:50:01.958 | 5342 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/1/2025/09/23/2025-09-23T05+45+24.577363596Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node1 | 4m 53.976s | 2025-09-23 05:50:01.958 | 5343 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 416 File: data/saved/preconsensus-events/1/2025/09/23/2025-09-23T05+45+24.577363596Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node1 | 4m 53.976s | 2025-09-23 05:50:01.958 | 5344 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node1 | 4m 53.983s | 2025-09-23 05:50:01.965 | 5345 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node1 | 4m 53.984s | 2025-09-23 05:50:01.966 | 5346 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 443 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/443 {"round":443,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/443/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node1 | 4m 53.985s | 2025-09-23 05:50:01.967 | 5347 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1 | |
| node2 | 4m 54.027s | 2025-09-23 05:50:02.009 | 5328 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 443 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/443 | |
| node2 | 4m 54.028s | 2025-09-23 05:50:02.010 | 5329 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 443 | |
| node3 | 4m 54.051s | 2025-09-23 05:50:02.033 | 5346 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 443 | |
| node3 | 4m 54.054s | 2025-09-23 05:50:02.036 | 5347 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 443 Timestamp: 2025-09-23T05:50:00.097636Z Next consensus number: 9925 Legacy running event hash: 143a70df4533710d6427795e30b0a9b5f53be75925eaad34616d105d7b021662cfc28c5045612443694f596619c724d1 Legacy running event mnemonic: mansion-brief-please-canvas Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -2085009100 Root hash: a9fd921a39a0b3589dda6ffd030171d4a797a201d4b0689904eceac1db3da5dabfe6054f7ca01320eebd591afc1ab86c (root) ConsistencyTestingToolState / scatter-rubber-cross-never 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 version-parade-cloth-alcohol 1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street 2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem 3 StringLeaf 6669061482678719376 /3 voice-crawl-light-peasant 4 StringLeaf 443 /4 buddy-hold-keep-hat | |||||||||
| node3 | 4m 54.062s | 2025-09-23 05:50:02.044 | 5348 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/3/2025/09/23/2025-09-23T05+45+24.442108548Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 4m 54.062s | 2025-09-23 05:50:02.044 | 5349 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 416 File: data/saved/preconsensus-events/3/2025/09/23/2025-09-23T05+45+24.442108548Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 4m 54.063s | 2025-09-23 05:50:02.045 | 5350 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node3 | 4m 54.070s | 2025-09-23 05:50:02.052 | 5351 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node3 | 4m 54.071s | 2025-09-23 05:50:02.053 | 5352 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 443 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/443 {"round":443,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/443/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node3 | 4m 54.073s | 2025-09-23 05:50:02.055 | 5353 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1 | |
| node2 | 4m 54.114s | 2025-09-23 05:50:02.096 | 5368 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 443 | |
| node2 | 4m 54.116s | 2025-09-23 05:50:02.098 | 5369 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 443 Timestamp: 2025-09-23T05:50:00.097636Z Next consensus number: 9925 Legacy running event hash: 143a70df4533710d6427795e30b0a9b5f53be75925eaad34616d105d7b021662cfc28c5045612443694f596619c724d1 Legacy running event mnemonic: mansion-brief-please-canvas Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -2085009100 Root hash: a9fd921a39a0b3589dda6ffd030171d4a797a201d4b0689904eceac1db3da5dabfe6054f7ca01320eebd591afc1ab86c (root) ConsistencyTestingToolState / scatter-rubber-cross-never 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 version-parade-cloth-alcohol 1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street 2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem 3 StringLeaf 6669061482678719376 /3 voice-crawl-light-peasant 4 StringLeaf 443 /4 buddy-hold-keep-hat | |||||||||
| node2 | 4m 54.122s | 2025-09-23 05:50:02.104 | 5370 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/2/2025/09/23/2025-09-23T05+45+24.650265559Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 4m 54.122s | 2025-09-23 05:50:02.104 | 5371 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 416 File: data/saved/preconsensus-events/2/2025/09/23/2025-09-23T05+45+24.650265559Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 4m 54.122s | 2025-09-23 05:50:02.104 | 5372 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node2 | 4m 54.130s | 2025-09-23 05:50:02.112 | 5373 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node2 | 4m 54.131s | 2025-09-23 05:50:02.113 | 5374 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 443 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/443 {"round":443,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/443/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node2 | 4m 54.132s | 2025-09-23 05:50:02.114 | 5375 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1 | |
| node0 | 4m 54.139s | 2025-09-23 05:50:02.121 | 5350 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 443 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/443 | |
| node0 | 4m 54.140s | 2025-09-23 05:50:02.122 | 5351 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 443 | |
| node0 | 4m 54.221s | 2025-09-23 05:50:02.203 | 5394 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 443 | |
| node0 | 4m 54.222s | 2025-09-23 05:50:02.204 | 5395 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 443 Timestamp: 2025-09-23T05:50:00.097636Z Next consensus number: 9925 Legacy running event hash: 143a70df4533710d6427795e30b0a9b5f53be75925eaad34616d105d7b021662cfc28c5045612443694f596619c724d1 Legacy running event mnemonic: mansion-brief-please-canvas Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -2085009100 Root hash: a9fd921a39a0b3589dda6ffd030171d4a797a201d4b0689904eceac1db3da5dabfe6054f7ca01320eebd591afc1ab86c (root) ConsistencyTestingToolState / scatter-rubber-cross-never 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 version-parade-cloth-alcohol 1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street 2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem 3 StringLeaf 6669061482678719376 /3 voice-crawl-light-peasant 4 StringLeaf 443 /4 buddy-hold-keep-hat | |||||||||
| node0 | 4m 54.228s | 2025-09-23 05:50:02.210 | 5396 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/0/2025/09/23/2025-09-23T05+45+24.129682821Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 4m 54.229s | 2025-09-23 05:50:02.211 | 5397 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 416 File: data/saved/preconsensus-events/0/2025/09/23/2025-09-23T05+45+24.129682821Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 4m 54.229s | 2025-09-23 05:50:02.211 | 5398 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node0 | 4m 54.236s | 2025-09-23 05:50:02.218 | 5399 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node0 | 4m 54.236s | 2025-09-23 05:50:02.218 | 5400 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 443 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/443 {"round":443,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/443/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node0 | 4m 54.238s | 2025-09-23 05:50:02.220 | 5401 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1 | |
| node4 | 5m 49.751s | 2025-09-23 05:50:57.733 | 1 | INFO | STARTUP | <main> | StaticPlatformBuilder: | ||
| ////////////////////// // Node is Starting // ////////////////////// | |||||||||
| node4 | 5m 49.836s | 2025-09-23 05:50:57.818 | 2 | DEBUG | STARTUP | <main> | StaticPlatformBuilder: | main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload] | |
| node4 | 5m 49.851s | 2025-09-23 05:50:57.833 | 3 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node4 | 5m 49.965s | 2025-09-23 05:50:57.947 | 4 | INFO | STARTUP | <main> | Browser: | The following nodes [4] are set to run locally | |
| node4 | 5m 49.972s | 2025-09-23 05:50:57.954 | 5 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | Registering ConsistencyTestingToolState with ConstructableRegistry | |
| node4 | 5m 49.983s | 2025-09-23 05:50:57.965 | 6 | INFO | STARTUP | <main> | ConsistencyTestingToolState: | New State Constructed. | |
| node4 | 5m 50.388s | 2025-09-23 05:50:58.370 | 9 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | ConsistencyTestingToolState is registered with ConstructableRegistry | |
| node4 | 5m 50.389s | 2025-09-23 05:50:58.371 | 10 | DEBUG | STARTUP | <main> | BootstrapUtils: | Scanning the classpath for RuntimeConstructable classes | |
| node4 | 5m 51.128s | 2025-09-23 05:50:59.110 | 11 | DEBUG | STARTUP | <main> | BootstrapUtils: | Done with registerConstructables, time taken 739ms | |
| node4 | 5m 51.136s | 2025-09-23 05:50:59.118 | 12 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | constructor called in Main. | |
| node4 | 5m 51.138s | 2025-09-23 05:50:59.120 | 13 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node4 | 5m 51.173s | 2025-09-23 05:50:59.155 | 14 | INFO | STARTUP | <main> | PrometheusEndpoint: | PrometheusEndpoint: Starting server listing on port: 9999 | |
| node4 | 5m 51.235s | 2025-09-23 05:50:59.217 | 15 | WARN | STARTUP | <main> | CryptoStatic: | There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB. | |
| node4 | 5m 51.237s | 2025-09-23 05:50:59.219 | 16 | DEBUG | STARTUP | <main> | CryptoStatic: | Started generating keys | |
| node4 | 5m 53.252s | 2025-09-23 05:51:01.234 | 17 | DEBUG | STARTUP | <main> | CryptoStatic: | Done generating keys | |
| node4 | 5m 53.331s | 2025-09-23 05:51:01.313 | 20 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node4 | 5m 53.337s | 2025-09-23 05:51:01.319 | 21 | INFO | STARTUP | <main> | StartupStateUtils: | The following saved states were found on disk: | |
| - /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/238/SignedState.swh - /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/148/SignedState.swh - /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/56/SignedState.swh - /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1/SignedState.swh | |||||||||
| node4 | 5m 53.337s | 2025-09-23 05:51:01.319 | 22 | INFO | STARTUP | <main> | StartupStateUtils: | Loading latest state from disk. | |
| node4 | 5m 53.338s | 2025-09-23 05:51:01.320 | 23 | INFO | STARTUP | <main> | StartupStateUtils: | Loading signed state from disk: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/238/SignedState.swh | |
| node4 | 5m 53.342s | 2025-09-23 05:51:01.324 | 24 | INFO | STARTUP | <main> | ConsistencyTestingToolState: | New State Constructed. | |
| node4 | 5m 53.346s | 2025-09-23 05:51:01.328 | 25 | INFO | STATE_TO_DISK | <main> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp | |
| node4 | 5m 53.470s | 2025-09-23 05:51:01.452 | 36 | INFO | STARTUP | <main> | StartupStateUtils: | Loaded state's hash is the same as when it was saved. | |
| node4 | 5m 53.473s | 2025-09-23 05:51:01.455 | 37 | INFO | STARTUP | <main> | StartupStateUtils: | Platform has loaded a saved state {"round":238,"consensusTimestamp":"2025-09-23T05:48:00.057986224Z"} [com.swirlds.logging.legacy.payload.SavedStateLoadedPayload] | |
| node4 | 5m 53.475s | 2025-09-23 05:51:01.457 | 40 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node4 | 5m 53.476s | 2025-09-23 05:51:01.458 | 43 | INFO | STARTUP | <main> | BootstrapUtils: | Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]. | |
| node4 | 5m 53.478s | 2025-09-23 05:51:01.460 | 44 | INFO | STARTUP | <main> | AddressBookInitializer: | Using the loaded state's address book and weight values. | |
| node4 | 5m 53.484s | 2025-09-23 05:51:01.466 | 45 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node4 | 5m 53.485s | 2025-09-23 05:51:01.467 | 46 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node1 | 5m 53.539s | 2025-09-23 05:51:01.521 | 6444 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 539 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node3 | 5m 53.610s | 2025-09-23 05:51:01.592 | 6456 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 539 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node2 | 5m 53.629s | 2025-09-23 05:51:01.611 | 6506 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 539 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node0 | 5m 53.675s | 2025-09-23 05:51:01.657 | 6510 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 539 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node1 | 5m 53.951s | 2025-09-23 05:51:01.933 | 6457 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 539 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/539 | |
| node1 | 5m 53.951s | 2025-09-23 05:51:01.933 | 6458 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 539 | |
| node2 | 5m 53.985s | 2025-09-23 05:51:01.967 | 6519 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 539 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/539 | |
| node2 | 5m 53.985s | 2025-09-23 05:51:01.967 | 6520 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 539 | |
| node3 | 5m 54.021s | 2025-09-23 05:51:02.003 | 6469 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 539 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/539 | |
| node3 | 5m 54.022s | 2025-09-23 05:51:02.004 | 6470 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 539 | |
| node1 | 5m 54.034s | 2025-09-23 05:51:02.016 | 6497 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 539 | |
| node1 | 5m 54.036s | 2025-09-23 05:51:02.018 | 6498 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 539 Timestamp: 2025-09-23T05:51:00.524811Z Next consensus number: 11479 Legacy running event hash: 87f9e510147f71e2eee4bdcbc75134061d50e4dc283fa5831bccce41d6705395300fa0ea4ba5f0ab25b467d47de2a607 Legacy running event mnemonic: abstract-fun-morning-detail Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -280004188 Root hash: 375a526b874826ce5d90f8d7fa06d066b475cfae4b2f01a3fcf1e9e610fb175c5e43e3bcd09fc54cbeb87f078b5dd512 (root) ConsistencyTestingToolState / twice-estate-blade-supply 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 glue-spread-ice-squirrel 1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street 2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem 3 StringLeaf -5855548732802215126 /3 fever-action-ten-park 4 StringLeaf 539 /4 sea-faculty-radar-already | |||||||||
| node1 | 5m 54.043s | 2025-09-23 05:51:02.025 | 6499 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/1/2025/09/23/2025-09-23T05+45+24.577363596Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/1/2025/09/23/2025-09-23T05+50+40.368679637Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node1 | 5m 54.043s | 2025-09-23 05:51:02.025 | 6500 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 512 File: data/saved/preconsensus-events/1/2025/09/23/2025-09-23T05+50+40.368679637Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node1 | 5m 54.043s | 2025-09-23 05:51:02.025 | 6501 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node1 | 5m 54.044s | 2025-09-23 05:51:02.026 | 6502 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node1 | 5m 54.044s | 2025-09-23 05:51:02.026 | 6503 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 539 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/539 {"round":539,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/539/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node1 | 5m 54.045s | 2025-09-23 05:51:02.027 | 6504 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/56 | |
| node0 | 5m 54.054s | 2025-09-23 05:51:02.036 | 6523 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 539 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/539 | |
| node0 | 5m 54.055s | 2025-09-23 05:51:02.037 | 6524 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 539 | |
| node2 | 5m 54.078s | 2025-09-23 05:51:02.060 | 6563 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 539 | |
| node2 | 5m 54.080s | 2025-09-23 05:51:02.062 | 6564 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 539 Timestamp: 2025-09-23T05:51:00.524811Z Next consensus number: 11479 Legacy running event hash: 87f9e510147f71e2eee4bdcbc75134061d50e4dc283fa5831bccce41d6705395300fa0ea4ba5f0ab25b467d47de2a607 Legacy running event mnemonic: abstract-fun-morning-detail Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -280004188 Root hash: 375a526b874826ce5d90f8d7fa06d066b475cfae4b2f01a3fcf1e9e610fb175c5e43e3bcd09fc54cbeb87f078b5dd512 (root) ConsistencyTestingToolState / twice-estate-blade-supply 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 glue-spread-ice-squirrel 1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street 2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem 3 StringLeaf -5855548732802215126 /3 fever-action-ten-park 4 StringLeaf 539 /4 sea-faculty-radar-already | |||||||||
| node2 | 5m 54.088s | 2025-09-23 05:51:02.070 | 6565 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/2/2025/09/23/2025-09-23T05+45+24.650265559Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/2/2025/09/23/2025-09-23T05+50+40.409172680Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node2 | 5m 54.088s | 2025-09-23 05:51:02.070 | 6566 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 512 File: data/saved/preconsensus-events/2/2025/09/23/2025-09-23T05+50+40.409172680Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node2 | 5m 54.088s | 2025-09-23 05:51:02.070 | 6567 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node2 | 5m 54.089s | 2025-09-23 05:51:02.071 | 6568 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node2 | 5m 54.089s | 2025-09-23 05:51:02.071 | 6569 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 539 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/539 {"round":539,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/539/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node2 | 5m 54.091s | 2025-09-23 05:51:02.073 | 6570 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/56 | |
| node3 | 5m 54.108s | 2025-09-23 05:51:02.090 | 6505 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 539 | |
| node3 | 5m 54.110s | 2025-09-23 05:51:02.092 | 6506 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 539 Timestamp: 2025-09-23T05:51:00.524811Z Next consensus number: 11479 Legacy running event hash: 87f9e510147f71e2eee4bdcbc75134061d50e4dc283fa5831bccce41d6705395300fa0ea4ba5f0ab25b467d47de2a607 Legacy running event mnemonic: abstract-fun-morning-detail Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -280004188 Root hash: 375a526b874826ce5d90f8d7fa06d066b475cfae4b2f01a3fcf1e9e610fb175c5e43e3bcd09fc54cbeb87f078b5dd512 (root) ConsistencyTestingToolState / twice-estate-blade-supply 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 glue-spread-ice-squirrel 1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street 2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem 3 StringLeaf -5855548732802215126 /3 fever-action-ten-park 4 StringLeaf 539 /4 sea-faculty-radar-already | |||||||||
| node3 | 5m 54.117s | 2025-09-23 05:51:02.099 | 6507 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/3/2025/09/23/2025-09-23T05+45+24.442108548Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/3/2025/09/23/2025-09-23T05+50+40.575689970Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node3 | 5m 54.117s | 2025-09-23 05:51:02.099 | 6508 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 512 File: data/saved/preconsensus-events/3/2025/09/23/2025-09-23T05+50+40.575689970Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node3 | 5m 54.117s | 2025-09-23 05:51:02.099 | 6509 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node3 | 5m 54.118s | 2025-09-23 05:51:02.100 | 6510 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node3 | 5m 54.118s | 2025-09-23 05:51:02.100 | 6511 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 539 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/539 {"round":539,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/539/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node3 | 5m 54.120s | 2025-09-23 05:51:02.102 | 6512 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/56 | |
| node0 | 5m 54.144s | 2025-09-23 05:51:02.126 | 6555 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 539 | |
| node0 | 5m 54.145s | 2025-09-23 05:51:02.127 | 6556 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 539 Timestamp: 2025-09-23T05:51:00.524811Z Next consensus number: 11479 Legacy running event hash: 87f9e510147f71e2eee4bdcbc75134061d50e4dc283fa5831bccce41d6705395300fa0ea4ba5f0ab25b467d47de2a607 Legacy running event mnemonic: abstract-fun-morning-detail Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -280004188 Root hash: 375a526b874826ce5d90f8d7fa06d066b475cfae4b2f01a3fcf1e9e610fb175c5e43e3bcd09fc54cbeb87f078b5dd512 (root) ConsistencyTestingToolState / twice-estate-blade-supply 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 glue-spread-ice-squirrel 1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street 2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem 3 StringLeaf -5855548732802215126 /3 fever-action-ten-park 4 StringLeaf 539 /4 sea-faculty-radar-already | |||||||||
| node0 | 5m 54.152s | 2025-09-23 05:51:02.134 | 6557 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/0/2025/09/23/2025-09-23T05+45+24.129682821Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/0/2025/09/23/2025-09-23T05+50+40.471554370Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node0 | 5m 54.152s | 2025-09-23 05:51:02.134 | 6558 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 512 File: data/saved/preconsensus-events/0/2025/09/23/2025-09-23T05+50+40.471554370Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node0 | 5m 54.152s | 2025-09-23 05:51:02.134 | 6559 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node0 | 5m 54.153s | 2025-09-23 05:51:02.135 | 6560 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node0 | 5m 54.153s | 2025-09-23 05:51:02.135 | 6561 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 539 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/539 {"round":539,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/539/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node0 | 5m 54.154s | 2025-09-23 05:51:02.136 | 6562 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/56 | |
| node4 | 5m 54.513s | 2025-09-23 05:51:02.495 | 47 | INFO | STARTUP | <main> | OSHealthChecker: | ||
| PASSED - Clock Source Speed Check Report[callsPerSec=26287275] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=195150, randomLong=824484040376964044, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=46660, randomLong=3008472311534728495, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=992130, data=35, exception=null] OS Health Check Report - Complete (took 1016 ms) | |||||||||
| node4 | 5m 54.540s | 2025-09-23 05:51:02.522 | 48 | DEBUG | STARTUP | <main> | BootstrapUtils: | jvmPauseDetectorThread started | |
| node4 | 5m 54.633s | 2025-09-23 05:51:02.615 | 49 | INFO | STARTUP | <main> | PcesUtilities: | Span compaction completed for data/saved/preconsensus-events/4/2025/09/23/2025-09-23T05+45+24.575421851Z_seq0_minr1_maxr501_orgn0.pces, new upper bound is 268 | |
| node4 | 5m 54.636s | 2025-09-23 05:51:02.618 | 50 | INFO | STARTUP | <main> | StandardScratchpad: | Scratchpad platform.iss contents: | |
| LAST_ISS_ROUND null | |||||||||
| node4 | 5m 54.641s | 2025-09-23 05:51:02.623 | 51 | INFO | STARTUP | <main> | PlatformBuilder: | Default platform pool parallelism: 8 | |
| node4 | 5m 54.718s | 2025-09-23 05:51:02.700 | 52 | INFO | STARTUP | <main> | SwirldsPlatform: | Starting with roster history: | |
| RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "I+Fdwg==", "port": 30124 }, { "ipAddressV4": "CoAAdw==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "Ijtt6w==", "port": 30125 }, { "ipAddressV4": "CoAAdg==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "IjoaMw==", "port": 30126 }, { "ipAddressV4": "CoAAeA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "IobWVg==", "port": 30127 }, { "ipAddressV4": "CoAAdQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "I8pw6w==", "port": 30128 }, { "ipAddressV4": "CoAAdA==", "port": 30128 }] }] } | |||||||||
| node4 | 5m 54.739s | 2025-09-23 05:51:02.721 | 53 | INFO | STARTUP | <main> | ConsistencyTestingToolState: | State initialized with state long 2930039329747737888. | |
| node4 | 5m 54.740s | 2025-09-23 05:51:02.722 | 54 | INFO | STARTUP | <main> | ConsistencyTestingToolState: | State initialized with 238 rounds handled. | |
| node4 | 5m 54.740s | 2025-09-23 05:51:02.722 | 55 | INFO | STARTUP | <main> | TransactionHandlingHistory: | Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv | |
| node4 | 5m 54.741s | 2025-09-23 05:51:02.723 | 56 | INFO | STARTUP | <main> | TransactionHandlingHistory: | Log file found. Parsing previous history | |
| node4 | 5m 55.471s | 2025-09-23 05:51:03.453 | 57 | INFO | STARTUP | <main> | StateInitializer: | The platform is using the following initial state: | |
| Round: 238 Timestamp: 2025-09-23T05:48:00.057986224Z Next consensus number: 6535 Legacy running event hash: 7586645a270846ebf9c8aa77f4b127f207cb29dd6dc87d32578564d294cd7a2f7e370d8fc336f77b6a309e18d0fa3164 Legacy running event mnemonic: radio-mercy-three-talent Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1594774619 Root hash: 6437ac6f182af9cfad7ea514d636f7554b018fa6fec8eb117e308328e66dc1c9a7b5bb235f80d5416a0fb82a76ce71dd (root) ConsistencyTestingToolState / manage-impulse-aim-shiver 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 position-swarm-music-end 1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street 2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem 3 StringLeaf 2930039329747737888 /3 leisure-recycle-vendor-solve 4 StringLeaf 238 /4 degree-garment-fashion-empower | |||||||||
| node4 | 5m 55.702s | 2025-09-23 05:51:03.684 | 59 | INFO | EVENT_STREAM | <main> | DefaultConsensusEventStream: | EventStreamManager::updateRunningHash: 7586645a270846ebf9c8aa77f4b127f207cb29dd6dc87d32578564d294cd7a2f7e370d8fc336f77b6a309e18d0fa3164 | |
| node4 | 5m 55.715s | 2025-09-23 05:51:03.697 | 60 | INFO | STARTUP | <platformForkJoinThread-4> | Shadowgraph: | Shadowgraph starting from expiration threshold 211 | |
| node4 | 5m 55.722s | 2025-09-23 05:51:03.704 | 62 | INFO | STARTUP | <<start-node-4>> | ConsistencyTestingToolMain: | init called in Main for node 4. | |
| node4 | 5m 55.723s | 2025-09-23 05:51:03.705 | 63 | INFO | STARTUP | <<start-node-4>> | SwirldsPlatform: | Starting platform 4 | |
| node4 | 5m 55.725s | 2025-09-23 05:51:03.707 | 64 | INFO | STARTUP | <<platform: recycle-bin-cleanup>> | RecycleBinImpl: | Deleted 0 files from the recycle bin. | |
| node4 | 5m 55.729s | 2025-09-23 05:51:03.711 | 65 | INFO | STARTUP | <<start-node-4>> | CycleFinder: | No cyclical back pressure detected in wiring model. | |
| node4 | 5m 55.730s | 2025-09-23 05:51:03.712 | 66 | INFO | STARTUP | <<start-node-4>> | DirectSchedulerChecks: | No illegal direct scheduler use detected in the wiring model. | |
| node4 | 5m 55.731s | 2025-09-23 05:51:03.713 | 67 | INFO | STARTUP | <<start-node-4>> | InputWireChecks: | All input wires have been bound. | |
| node4 | 5m 55.733s | 2025-09-23 05:51:03.715 | 68 | INFO | STARTUP | <<start-node-4>> | SwirldsPlatform: | replaying preconsensus event stream starting at 211 | |
| node4 | 5m 55.738s | 2025-09-23 05:51:03.720 | 69 | INFO | PLATFORM_STATUS | <platformForkJoinThread-4> | DefaultStatusStateMachine: | Platform spent 184.0 ms in STARTING_UP. Now in REPLAYING_EVENTS | |
| node4 | 5m 55.959s | 2025-09-23 05:51:03.941 | 70 | INFO | STARTUP | <<scheduler ConsensusEngine>> | ConsensusImpl: | Found init judge (CR:4 H:e7ff5c6ceff6 BR:236), num remaining: 4 | |
| node4 | 5m 55.962s | 2025-09-23 05:51:03.944 | 71 | INFO | STARTUP | <<scheduler ConsensusEngine>> | ConsensusImpl: | Found init judge (CR:3 H:b2c73a872ae3 BR:236), num remaining: 3 | |
| node4 | 5m 55.962s | 2025-09-23 05:51:03.944 | 72 | INFO | STARTUP | <<scheduler ConsensusEngine>> | ConsensusImpl: | Found init judge (CR:0 H:e64eb39a52e3 BR:236), num remaining: 2 | |
| node4 | 5m 55.963s | 2025-09-23 05:51:03.945 | 73 | INFO | STARTUP | <<scheduler ConsensusEngine>> | ConsensusImpl: | Found init judge (CR:1 H:be0f25571b17 BR:237), num remaining: 1 | |
| node4 | 5m 55.964s | 2025-09-23 05:51:03.946 | 74 | INFO | STARTUP | <<scheduler ConsensusEngine>> | ConsensusImpl: | Found init judge (CR:2 H:484b1f6561be BR:236), num remaining: 0 | |
| node4 | 5m 56.127s | 2025-09-23 05:51:04.109 | 152 | INFO | STARTUP | <<start-node-4>> | PcesReplayer: | Replayed 1,595 preconsensus events with max birth round 268. These events contained 4,003 transactions. 29 rounds reached consensus spanning 17.7 seconds of consensus time. The latest round to reach consensus is round 267. Replay took 393.0 milliseconds. | |
| node4 | 5m 56.128s | 2025-09-23 05:51:04.110 | 155 | INFO | STARTUP | <<app: appMain 4>> | ConsistencyTestingToolMain: | run called in Main. | |
| node4 | 5m 56.130s | 2025-09-23 05:51:04.112 | 156 | INFO | PLATFORM_STATUS | <platformForkJoinThread-3> | DefaultStatusStateMachine: | Platform spent 390.0 ms in REPLAYING_EVENTS. Now in OBSERVING | |
| node4 | 5m 57.006s | 2025-09-23 05:51:04.988 | 327 | INFO | PLATFORM_STATUS | <platformForkJoinThread-2> | DefaultStatusStateMachine: | Platform spent 874.0 ms in OBSERVING. Now in BEHIND | |
| node4 | 5m 57.006s | 2025-09-23 05:51:04.988 | 328 | INFO | RECONNECT | <platformForkJoinThread-5> | ReconnectController: | Starting ReconnectController | |
| node4 | 5m 57.007s | 2025-09-23 05:51:04.989 | 329 | INFO | RECONNECT | <<reconnect: reconnect-controller>> | ReconnectPlatformHelperImpl: | Preparing for reconnect, stopping gossip | |
| node4 | 5m 57.008s | 2025-09-23 05:51:04.990 | 330 | INFO | RECONNECT | <<reconnect: reconnect-controller>> | ReconnectPlatformHelperImpl: | Preparing for reconnect, start clearing queues | |
| node4 | 5m 57.009s | 2025-09-23 05:51:04.991 | 331 | INFO | RECONNECT | <<reconnect: reconnect-controller>> | ReconnectPlatformHelperImpl: | Queues have been cleared | |
| node4 | 5m 57.010s | 2025-09-23 05:51:04.992 | 332 | INFO | RECONNECT | <<reconnect: reconnect-controller>> | ReconnectController: | waiting for reconnect connection | |
| node4 | 5m 57.010s | 2025-09-23 05:51:04.992 | 333 | INFO | RECONNECT | <<reconnect: reconnect-controller>> | ReconnectController: | acquired reconnect connection | |
| node3 | 5m 57.252s | 2025-09-23 05:51:05.234 | 6560 | INFO | RECONNECT | <<platform-core: SyncProtocolWith4 3 to 4>> | ReconnectTeacher: | Starting reconnect in the role of the sender {"receiving":false,"nodeId":3,"otherNodeId":4,"round":543} [com.swirlds.logging.legacy.payload.ReconnectStartPayload] | |
| node3 | 5m 57.253s | 2025-09-23 05:51:05.235 | 6561 | INFO | RECONNECT | <<platform-core: SyncProtocolWith4 3 to 4>> | ReconnectTeacher: | The following state will be sent to the learner: | |
| Round: 543 Timestamp: 2025-09-23T05:51:02.924955Z Next consensus number: 11540 Legacy running event hash: 656a1c5a824dee87beddc92e5baf3e6e611ded7813ed39014df94e093661a51176a8a6d7481af5e464e3cdfceae74c16 Legacy running event mnemonic: pull-mercy-reopen-connect Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -1021967253 Root hash: b018ff1fe96c369d24b56af96c8f5894e74ce07123463298545920040356204c777453abc5535c96195089c402940175 (root) ConsistencyTestingToolState / wonder-bring-spoon-squirrel 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 taxi-mirror-broken-shop 1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street 2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem 3 StringLeaf -7136128642908826753 /3 concert-hazard-manual-vintage 4 StringLeaf 543 /4 enforce-caution-wife-elbow | |||||||||
| node3 | 5m 57.254s | 2025-09-23 05:51:05.236 | 6562 | INFO | RECONNECT | <<platform-core: SyncProtocolWith4 3 to 4>> | ReconnectTeacher: | Sending signatures from nodes 0, 1, 3 (signing weight = 37500000000/50000000000) for state hash b018ff1fe96c369d24b56af96c8f5894e74ce07123463298545920040356204c777453abc5535c96195089c402940175 | |
| node3 | 5m 57.254s | 2025-09-23 05:51:05.236 | 6563 | INFO | RECONNECT | <<platform-core: SyncProtocolWith4 3 to 4>> | ReconnectTeacher: | Starting synchronization in the role of the sender. | |
| node3 | 5m 57.260s | 2025-09-23 05:51:05.242 | 6564 | INFO | RECONNECT | <<platform-core: SyncProtocolWith4 3 to 4>> | TeachingSynchronizer: | sending tree rooted at com.swirlds.demo.consistency.ConsistencyTestingToolState with route [] | |
| node3 | 5m 57.270s | 2025-09-23 05:51:05.252 | 6565 | INFO | RECONNECT | <<work group teaching-synchronizer: async-input-stream #0>> | AsyncInputStream: | com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@1ec7ccae start run() | |
| node4 | 5m 57.311s | 2025-09-23 05:51:05.293 | 334 | INFO | RECONNECT | <<reconnect: reconnect-controller>> | ReconnectSyncHelper: | Starting reconnect in role of the receiver. {"receiving":true,"nodeId":4,"otherNodeId":3,"round":266} [com.swirlds.logging.legacy.payload.ReconnectStartPayload] | |
| node4 | 5m 57.313s | 2025-09-23 05:51:05.295 | 335 | INFO | RECONNECT | <<reconnect: reconnect-controller>> | ReconnectLearner: | Receiving signed state signatures | |
| node4 | 5m 57.325s | 2025-09-23 05:51:05.307 | 336 | INFO | RECONNECT | <<reconnect: reconnect-controller>> | ReconnectLearner: | Received signatures from nodes 0, 1, 3 | |
| node4 | 5m 57.327s | 2025-09-23 05:51:05.309 | 337 | INFO | RECONNECT | <<reconnect: reconnect-controller>> | LearningSynchronizer: | learner calls receiveTree() | |
| node4 | 5m 57.328s | 2025-09-23 05:51:05.310 | 338 | INFO | RECONNECT | <<reconnect: reconnect-controller>> | LearningSynchronizer: | synchronizing tree | |
| node4 | 5m 57.328s | 2025-09-23 05:51:05.310 | 339 | INFO | RECONNECT | <<reconnect: reconnect-controller>> | LearningSynchronizer: | receiving tree rooted at com.swirlds.demo.consistency.ConsistencyTestingToolState with route [] | |
| node4 | 5m 57.333s | 2025-09-23 05:51:05.315 | 340 | INFO | RECONNECT | <<work group learning-synchronizer: async-input-stream #0>> | AsyncInputStream: | com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@4dfef17d start run() | |
| node4 | 5m 57.342s | 2025-09-23 05:51:05.324 | 341 | INFO | STARTUP | <<work group learning-synchronizer: async-input-stream #0>> | ConsistencyTestingToolState: | New State Constructed. | |
| node3 | 5m 57.424s | 2025-09-23 05:51:05.406 | 6584 | INFO | RECONNECT | <<work group teaching-synchronizer: async-input-stream #0>> | AsyncInputStream: | com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@1ec7ccae finish run() | |
| node3 | 5m 57.425s | 2025-09-23 05:51:05.407 | 6585 | INFO | RECONNECT | <<platform-core: SyncProtocolWith4 3 to 4>> | TeachingSynchronizer: | finished sending tree | |
| node3 | 5m 57.425s | 2025-09-23 05:51:05.407 | 6586 | INFO | RECONNECT | <<platform-core: SyncProtocolWith4 3 to 4>> | TeachingSynchronizer: | sending tree rooted at com.swirlds.virtualmap.VirtualMap with route [2] | |
| node3 | 5m 57.426s | 2025-09-23 05:51:05.408 | 6587 | INFO | RECONNECT | <<work group teaching-synchronizer: async-input-stream #0>> | AsyncInputStream: | com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@2a99c12c start run() | |
| node4 | 5m 57.536s | 2025-09-23 05:51:05.518 | 365 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | LearnerPushTask: | learner thread finished the learning loop for the current subtree | |
| node4 | 5m 57.537s | 2025-09-23 05:51:05.519 | 366 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | LearnerPushTask: | learner thread closed input, output, and view for the current subtree | |
| node4 | 5m 57.537s | 2025-09-23 05:51:05.519 | 367 | INFO | RECONNECT | <<work group learning-synchronizer: async-input-stream #0>> | AsyncInputStream: | com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@4dfef17d finish run() | |
| node4 | 5m 57.538s | 2025-09-23 05:51:05.520 | 368 | INFO | RECONNECT | <<reconnect: reconnect-controller>> | LearningSynchronizer: | received tree rooted at com.swirlds.demo.consistency.ConsistencyTestingToolState with route [] | |
| node4 | 5m 57.538s | 2025-09-23 05:51:05.520 | 369 | INFO | RECONNECT | <<reconnect: reconnect-controller>> | LearningSynchronizer: | receiving tree rooted at com.swirlds.virtualmap.VirtualMap with route [2] | |
| node4 | 5m 57.541s | 2025-09-23 05:51:05.523 | 370 | INFO | RECONNECT | <<work group learning-synchronizer: async-input-stream #0>> | AsyncInputStream: | com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@399f735c start run() | |
| node4 | 5m 57.609s | 2025-09-23 05:51:05.591 | 371 | INFO | RECONNECT | <<work group learning-synchronizer: async-input-stream #0>> | ReconnectNodeRemover: | setPathInformation(): firstLeafPath: 1 -> 1, lastLeafPath: 1 -> 1 | |
| node4 | 5m 57.609s | 2025-09-23 05:51:05.591 | 372 | INFO | RECONNECT | <<work group learning-synchronizer: async-input-stream #0>> | ReconnectNodeRemover: | setPathInformation(): done | |
| node4 | 5m 57.611s | 2025-09-23 05:51:05.593 | 373 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | LearnerPushTask: | learner thread finished the learning loop for the current subtree | |
| node4 | 5m 57.612s | 2025-09-23 05:51:05.594 | 374 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | LearnerPushVirtualTreeView: | call nodeRemover.allNodesReceived() | |
| node4 | 5m 57.612s | 2025-09-23 05:51:05.594 | 375 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | ReconnectNodeRemover: | allNodesReceived() | |
| node4 | 5m 57.612s | 2025-09-23 05:51:05.594 | 376 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | ReconnectNodeRemover: | allNodesReceived(): done | |
| node4 | 5m 57.612s | 2025-09-23 05:51:05.594 | 377 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | LearnerPushVirtualTreeView: | call root.endLearnerReconnect() | |
| node4 | 5m 57.613s | 2025-09-23 05:51:05.595 | 378 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | VirtualMap: | call reconnectIterator.close() | |
| node4 | 5m 57.613s | 2025-09-23 05:51:05.595 | 379 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | VirtualMap: | call setHashPrivate() | |
| node3 | 5m 57.681s | 2025-09-23 05:51:05.663 | 6591 | INFO | RECONNECT | <<work group teaching-synchronizer: async-input-stream #0>> | AsyncInputStream: | com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@2a99c12c finish run() | |
| node3 | 5m 57.682s | 2025-09-23 05:51:05.664 | 6592 | INFO | RECONNECT | <<platform-core: SyncProtocolWith4 3 to 4>> | TeachingSynchronizer: | finished sending tree | |
| node3 | 5m 57.685s | 2025-09-23 05:51:05.667 | 6595 | INFO | RECONNECT | <<platform-core: SyncProtocolWith4 3 to 4>> | ReconnectTeacher: | Finished synchronization in the role of the sender. | |
| node4 | 5m 57.764s | 2025-09-23 05:51:05.746 | 389 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | VirtualMap: | call postInit() | |
| node4 | 5m 57.765s | 2025-09-23 05:51:05.747 | 391 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | VirtualMap: | endLearnerReconnect() complete | |
| node4 | 5m 57.765s | 2025-09-23 05:51:05.747 | 392 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | LearnerPushVirtualTreeView: | close() complete | |
| node4 | 5m 57.765s | 2025-09-23 05:51:05.747 | 393 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | LearnerPushTask: | learner thread closed input, output, and view for the current subtree | |
| node4 | 5m 57.766s | 2025-09-23 05:51:05.748 | 394 | INFO | RECONNECT | <<work group learning-synchronizer: async-input-stream #0>> | AsyncInputStream: | com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@399f735c finish run() | |
| node4 | 5m 57.766s | 2025-09-23 05:51:05.748 | 395 | INFO | RECONNECT | <<reconnect: reconnect-controller>> | LearningSynchronizer: | received tree rooted at com.swirlds.virtualmap.VirtualMap with route [2] | |
| node4 | 5m 57.766s | 2025-09-23 05:51:05.748 | 396 | INFO | RECONNECT | <<reconnect: reconnect-controller>> | LearningSynchronizer: | synchronization complete | |
| node4 | 5m 57.767s | 2025-09-23 05:51:05.749 | 397 | INFO | RECONNECT | <<reconnect: reconnect-controller>> | LearningSynchronizer: | learner calls initialize() | |
| node4 | 5m 57.767s | 2025-09-23 05:51:05.749 | 398 | INFO | RECONNECT | <<reconnect: reconnect-controller>> | LearningSynchronizer: | initializing tree | |
| node4 | 5m 57.767s | 2025-09-23 05:51:05.749 | 399 | INFO | RECONNECT | <<reconnect: reconnect-controller>> | LearningSynchronizer: | initialization complete | |
| node4 | 5m 57.767s | 2025-09-23 05:51:05.749 | 400 | INFO | RECONNECT | <<reconnect: reconnect-controller>> | LearningSynchronizer: | learner calls hash() | |
| node4 | 5m 57.767s | 2025-09-23 05:51:05.749 | 401 | INFO | RECONNECT | <<reconnect: reconnect-controller>> | LearningSynchronizer: | hashing tree | |
| node4 | 5m 57.768s | 2025-09-23 05:51:05.750 | 402 | INFO | RECONNECT | <<reconnect: reconnect-controller>> | LearningSynchronizer: | hashing complete | |
| node4 | 5m 57.768s | 2025-09-23 05:51:05.750 | 403 | INFO | RECONNECT | <<reconnect: reconnect-controller>> | LearningSynchronizer: | learner calls logStatistics() | |
| node4 | 5m 57.771s | 2025-09-23 05:51:05.753 | 404 | INFO | RECONNECT | <<reconnect: reconnect-controller>> | LearningSynchronizer: | Finished synchronization {"timeInSeconds":0.438,"hashTimeInSeconds":0.0,"initializationTimeInSeconds":0.0,"totalNodes":12,"leafNodes":7,"redundantLeafNodes":4,"internalNodes":5,"redundantInternalNodes":2} [com.swirlds.logging.legacy.payload.SynchronizationCompletePayload] | |
| node4 | 5m 57.772s | 2025-09-23 05:51:05.754 | 405 | INFO | RECONNECT | <<reconnect: reconnect-controller>> | LearningSynchronizer: | ReconnectMapMetrics: transfersFromTeacher=12; transfersFromLearner=10; internalHashes=0; internalCleanHashes=0; internalData=0; internalCleanData=0; leafHashes=3; leafCleanHashes=3; leafData=7; leafCleanData=4 | |
| node4 | 5m 57.772s | 2025-09-23 05:51:05.754 | 406 | INFO | RECONNECT | <<reconnect: reconnect-controller>> | LearningSynchronizer: | learner is done synchronizing | |
| node4 | 5m 57.774s | 2025-09-23 05:51:05.756 | 407 | INFO | RECONNECT | <<reconnect: reconnect-controller>> | ReconnectLearner: | Reconnect data usage report {"dataMegabytes":0.0060558319091796875} [com.swirlds.logging.legacy.payload.ReconnectDataUsagePayload] | |
| node4 | 5m 57.777s | 2025-09-23 05:51:05.759 | 408 | INFO | RECONNECT | <<reconnect: reconnect-controller>> | ReconnectSyncHelper: | Finished reconnect in the role of the receiver. {"receiving":true,"nodeId":4,"otherNodeId":3,"round":543,"success":false} [com.swirlds.logging.legacy.payload.ReconnectFinishPayload] | |
| node4 | 5m 57.778s | 2025-09-23 05:51:05.760 | 409 | INFO | RECONNECT | <<reconnect: reconnect-controller>> | ReconnectSyncHelper: | Information for state received during reconnect: | |
| Round: 543 Timestamp: 2025-09-23T05:51:02.924955Z Next consensus number: 11540 Legacy running event hash: 656a1c5a824dee87beddc92e5baf3e6e611ded7813ed39014df94e093661a51176a8a6d7481af5e464e3cdfceae74c16 Legacy running event mnemonic: pull-mercy-reopen-connect Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -1021967253 Root hash: b018ff1fe96c369d24b56af96c8f5894e74ce07123463298545920040356204c777453abc5535c96195089c402940175 (root) ConsistencyTestingToolState / wonder-bring-spoon-squirrel 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 taxi-mirror-broken-shop 1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street 2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem 3 StringLeaf -7136128642908826753 /3 concert-hazard-manual-vintage 4 StringLeaf 543 /4 enforce-caution-wife-elbow | |||||||||
| node4 | 5m 57.779s | 2025-09-23 05:51:05.761 | 411 | DEBUG | RECONNECT | <<reconnect: reconnect-controller>> | ReconnectStateLoader: | `loadReconnectState` : reloading state | |
| node4 | 5m 57.779s | 2025-09-23 05:51:05.761 | 412 | INFO | STARTUP | <<reconnect: reconnect-controller>> | ConsistencyTestingToolState: | State initialized with state long -7136128642908826753. | |
| node4 | 5m 57.779s | 2025-09-23 05:51:05.761 | 413 | INFO | STARTUP | <<reconnect: reconnect-controller>> | ConsistencyTestingToolState: | State initialized with 543 rounds handled. | |
| node4 | 5m 57.779s | 2025-09-23 05:51:05.761 | 414 | INFO | STARTUP | <<reconnect: reconnect-controller>> | TransactionHandlingHistory: | Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv | |
| node4 | 5m 57.779s | 2025-09-23 05:51:05.761 | 415 | INFO | STARTUP | <<reconnect: reconnect-controller>> | TransactionHandlingHistory: | Log file found. Parsing previous history | |
| node4 | 5m 57.813s | 2025-09-23 05:51:05.795 | 422 | INFO | STATE_TO_DISK | <<reconnect: reconnect-controller>> | DefaultSavedStateController: | Signed state from round 543 created, will eventually be written to disk, for reason: RECONNECT | |
| node4 | 5m 57.813s | 2025-09-23 05:51:05.795 | 423 | INFO | PLATFORM_STATUS | <platformForkJoinThread-3> | DefaultStatusStateMachine: | Platform spent 807.0 ms in BEHIND. Now in RECONNECT_COMPLETE | |
| node4 | 5m 57.814s | 2025-09-23 05:51:05.796 | 424 | INFO | STARTUP | <platformForkJoinThread-3> | Shadowgraph: | Shadowgraph starting from expiration threshold 516 | |
| node4 | 5m 57.816s | 2025-09-23 05:51:05.798 | 427 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 543 state to disk. Reason: RECONNECT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/543 | |
| node4 | 5m 57.817s | 2025-09-23 05:51:05.799 | 428 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/2 for round 543 | |
| node4 | 5m 57.831s | 2025-09-23 05:51:05.813 | 438 | INFO | EVENT_STREAM | <<reconnect: reconnect-controller>> | DefaultConsensusEventStream: | EventStreamManager::updateRunningHash: 656a1c5a824dee87beddc92e5baf3e6e611ded7813ed39014df94e093661a51176a8a6d7481af5e464e3cdfceae74c16 | |
| node4 | 5m 57.832s | 2025-09-23 05:51:05.814 | 439 | INFO | STARTUP | <platformForkJoinThread-5> | PcesFileManager: | Due to recent operations on this node, the local preconsensus event stream will have a discontinuity. The last file with the old origin round is 2025-09-23T05+45+24.575421851Z_seq0_minr1_maxr268_orgn0.pces. All future files will have an origin round of 543. | |
| node3 | 5m 57.848s | 2025-09-23 05:51:05.830 | 6596 | INFO | RECONNECT | <<platform-core: SyncProtocolWith4 3 to 4>> | ReconnectTeacher: | Finished reconnect in the role of the sender. {"receiving":false,"nodeId":3,"otherNodeId":4,"round":543,"success":false} [com.swirlds.logging.legacy.payload.ReconnectFinishPayload] | |
| node4 | 5m 57.958s | 2025-09-23 05:51:05.940 | 473 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/2 for round 543 | |
| node4 | 5m 57.961s | 2025-09-23 05:51:05.943 | 474 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 543
  Timestamp: 2025-09-23T05:51:02.924955Z
  Next consensus number: 11540
  Legacy running event hash: 656a1c5a824dee87beddc92e5baf3e6e611ded7813ed39014df94e093661a51176a8a6d7481af5e464e3cdfceae74c16
  Legacy running event mnemonic: pull-mercy-reopen-connect
  Rounds non-ancient: 26
  Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
  Minimum judge hash code: -1021967253
  Root hash: b018ff1fe96c369d24b56af96c8f5894e74ce07123463298545920040356204c777453abc5535c96195089c402940175
  (root) ConsistencyTestingToolState / wonder-bring-spoon-squirrel
      0 SingletonNode PlatformStateService.PLATFORM_STATE /0 taxi-mirror-broken-shop
      1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street
      2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem
      3 StringLeaf -7136128642908826753 /3 concert-hazard-manual-vintage
      4 StringLeaf 543 /4 enforce-caution-wife-elbow | |||||||||
| node4 | 5m 57.994s | 2025-09-23 05:51:05.976 | 475 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/4/2025/09/23/2025-09-23T05+45+24.575421851Z_seq0_minr1_maxr268_orgn0.pces | |||||||||
| node4 | 5m 57.994s | 2025-09-23 05:51:05.976 | 476 | WARN | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | No preconsensus event files meeting specified criteria found to copy. Lower bound: 516 | |
| node4 | 5m 58.000s | 2025-09-23 05:51:05.982 | 477 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 543 to disk. Reason: RECONNECT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/543 {"round":543,"freezeState":false,"reason":"RECONNECT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/543/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node4 | 5m 58.003s | 2025-09-23 05:51:05.985 | 478 | INFO | PLATFORM_STATUS | <platformForkJoinThread-7> | DefaultStatusStateMachine: | Platform spent 189.0 ms in RECONNECT_COMPLETE. Now in CHECKING | |
| node4 | 5m 58.733s | 2025-09-23 05:51:06.715 | 479 | INFO | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting4.csv' ] | |
| node4 | 5m 58.736s | 2025-09-23 05:51:06.718 | 480 | DEBUG | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ] | |
| node4 | 5m 58.909s | 2025-09-23 05:51:06.891 | 481 | INFO | STARTUP | <<scheduler ConsensusEngine>> | ConsensusImpl: | Found init judge (CR:3 H:6ac926997ea7 BR:541), num remaining: 3 | |
| node4 | 5m 58.914s | 2025-09-23 05:51:06.896 | 482 | INFO | STARTUP | <<scheduler ConsensusEngine>> | ConsensusImpl: | Found init judge (CR:1 H:194689725346 BR:541), num remaining: 2 | |
| node4 | 5m 58.915s | 2025-09-23 05:51:06.897 | 483 | INFO | STARTUP | <<scheduler ConsensusEngine>> | ConsensusImpl: | Found init judge (CR:2 H:0d6c5c47f22c BR:541), num remaining: 1 | |
| node4 | 5m 58.916s | 2025-09-23 05:51:06.898 | 484 | INFO | STARTUP | <<scheduler ConsensusEngine>> | ConsensusImpl: | Found init judge (CR:0 H:657113d534e0 BR:542), num remaining: 0 | |
| node4 | 6m 2.639s | 2025-09-23 05:51:10.621 | 574 | INFO | PLATFORM_STATUS | <platformForkJoinThread-6> | DefaultStatusStateMachine: | Platform spent 4.6 s in CHECKING. Now in ACTIVE | |
| node0 | 6m 53.872s | 2025-09-23 05:52:01.854 | 7669 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 635 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node4 | 6m 53.946s | 2025-09-23 05:52:01.928 | 1491 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 635 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node3 | 6m 54.011s | 2025-09-23 05:52:01.993 | 7630 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 635 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node1 | 6m 54.029s | 2025-09-23 05:52:02.011 | 7587 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 635 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node2 | 6m 54.084s | 2025-09-23 05:52:02.066 | 7653 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 635 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node1 | 6m 54.218s | 2025-09-23 05:52:02.200 | 7590 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 635 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/635 | |
| node1 | 6m 54.219s | 2025-09-23 05:52:02.201 | 7591 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 635 | |
| node0 | 6m 54.288s | 2025-09-23 05:52:02.270 | 7672 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 635 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/635 | |
| node0 | 6m 54.289s | 2025-09-23 05:52:02.271 | 7673 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 635 | |
| node1 | 6m 54.305s | 2025-09-23 05:52:02.287 | 7626 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 635 | |
| node1 | 6m 54.308s | 2025-09-23 05:52:02.290 | 7627 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 635
  Timestamp: 2025-09-23T05:52:00.583682Z
  Next consensus number: 13859
  Legacy running event hash: 533064ce18c6ac561ef4854522b89bf277e864a0e03edee565b0fd2a00a4aea1361d217f29126b8ede6295e13b06f75f
  Legacy running event mnemonic: gravity-oil-boat-inspire
  Rounds non-ancient: 26
  Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
  Minimum judge hash code: 1637780202
  Root hash: 0099c0e2d0a559cd56d6a60a055651929d525f453ef19c98cf7db7d5d0eaa992543c0116dd5922790567d1136d4552f9
  (root) ConsistencyTestingToolState / basket-toilet-twin-defy
      0 SingletonNode PlatformStateService.PLATFORM_STATE /0 twelve-day-guitar-horn
      1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street
      2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem
      3 StringLeaf -5508992547220305676 /3 debate-badge-fuel-visit
      4 StringLeaf 635 /4 burden-couch-female-purity | |||||||||
| node1 | 6m 54.315s | 2025-09-23 05:52:02.297 | 7628 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/1/2025/09/23/2025-09-23T05+45+24.577363596Z_seq0_minr1_maxr501_orgn0.pces
  Last file: data/saved/preconsensus-events/1/2025/09/23/2025-09-23T05+50+40.368679637Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node1 | 6m 54.318s | 2025-09-23 05:52:02.300 | 7629 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 608
  File: data/saved/preconsensus-events/1/2025/09/23/2025-09-23T05+50+40.368679637Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node1 | 6m 54.319s | 2025-09-23 05:52:02.301 | 7630 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node1 | 6m 54.321s | 2025-09-23 05:52:02.303 | 7631 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node1 | 6m 54.321s | 2025-09-23 05:52:02.303 | 7632 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 635 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/635 {"round":635,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/635/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node1 | 6m 54.323s | 2025-09-23 05:52:02.305 | 7633 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/148 | |
| node4 | 6m 54.368s | 2025-09-23 05:52:02.350 | 1494 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 635 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/635 | |
| node4 | 6m 54.369s | 2025-09-23 05:52:02.351 | 1495 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 635 | |
| node0 | 6m 54.379s | 2025-09-23 05:52:02.361 | 7704 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 635 | |
| node0 | 6m 54.381s | 2025-09-23 05:52:02.363 | 7705 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 635
  Timestamp: 2025-09-23T05:52:00.583682Z
  Next consensus number: 13859
  Legacy running event hash: 533064ce18c6ac561ef4854522b89bf277e864a0e03edee565b0fd2a00a4aea1361d217f29126b8ede6295e13b06f75f
  Legacy running event mnemonic: gravity-oil-boat-inspire
  Rounds non-ancient: 26
  Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
  Minimum judge hash code: 1637780202
  Root hash: 0099c0e2d0a559cd56d6a60a055651929d525f453ef19c98cf7db7d5d0eaa992543c0116dd5922790567d1136d4552f9
  (root) ConsistencyTestingToolState / basket-toilet-twin-defy
      0 SingletonNode PlatformStateService.PLATFORM_STATE /0 twelve-day-guitar-horn
      1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street
      2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem
      3 StringLeaf -5508992547220305676 /3 debate-badge-fuel-visit
      4 StringLeaf 635 /4 burden-couch-female-purity | |||||||||
| node0 | 6m 54.387s | 2025-09-23 05:52:02.369 | 7706 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/0/2025/09/23/2025-09-23T05+45+24.129682821Z_seq0_minr1_maxr501_orgn0.pces
  Last file: data/saved/preconsensus-events/0/2025/09/23/2025-09-23T05+50+40.471554370Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node0 | 6m 54.390s | 2025-09-23 05:52:02.372 | 7707 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 608
  File: data/saved/preconsensus-events/0/2025/09/23/2025-09-23T05+50+40.471554370Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node0 | 6m 54.390s | 2025-09-23 05:52:02.372 | 7708 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node0 | 6m 54.392s | 2025-09-23 05:52:02.374 | 7709 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node0 | 6m 54.393s | 2025-09-23 05:52:02.375 | 7710 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 635 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/635 {"round":635,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/635/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node0 | 6m 54.394s | 2025-09-23 05:52:02.376 | 7711 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/148 | |
| node3 | 6m 54.422s | 2025-09-23 05:52:02.404 | 7633 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 635 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/635 | |
| node3 | 6m 54.423s | 2025-09-23 05:52:02.405 | 7634 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/44 for round 635 | |
| node4 | 6m 54.465s | 2025-09-23 05:52:02.447 | 1529 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 635 | |
| node4 | 6m 54.467s | 2025-09-23 05:52:02.449 | 1530 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 635
  Timestamp: 2025-09-23T05:52:00.583682Z
  Next consensus number: 13859
  Legacy running event hash: 533064ce18c6ac561ef4854522b89bf277e864a0e03edee565b0fd2a00a4aea1361d217f29126b8ede6295e13b06f75f
  Legacy running event mnemonic: gravity-oil-boat-inspire
  Rounds non-ancient: 26
  Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
  Minimum judge hash code: 1637780202
  Root hash: 0099c0e2d0a559cd56d6a60a055651929d525f453ef19c98cf7db7d5d0eaa992543c0116dd5922790567d1136d4552f9
  (root) ConsistencyTestingToolState / basket-toilet-twin-defy
      0 SingletonNode PlatformStateService.PLATFORM_STATE /0 twelve-day-guitar-horn
      1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street
      2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem
      3 StringLeaf -5508992547220305676 /3 debate-badge-fuel-visit
      4 StringLeaf 635 /4 burden-couch-female-purity | |||||||||
| node4 | 6m 54.475s | 2025-09-23 05:52:02.457 | 1531 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/4/2025/09/23/2025-09-23T05+51+06.328581851Z_seq1_minr516_maxr1016_orgn543.pces
  Last file: data/saved/preconsensus-events/4/2025/09/23/2025-09-23T05+45+24.575421851Z_seq0_minr1_maxr268_orgn0.pces | |||||||||
| node4 | 6m 54.476s | 2025-09-23 05:52:02.458 | 1532 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 608
  File: data/saved/preconsensus-events/4/2025/09/23/2025-09-23T05+51+06.328581851Z_seq1_minr516_maxr1016_orgn543.pces | |||||||||
| node4 | 6m 54.476s | 2025-09-23 05:52:02.458 | 1533 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node4 | 6m 54.479s | 2025-09-23 05:52:02.461 | 1534 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node4 | 6m 54.479s | 2025-09-23 05:52:02.461 | 1535 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 635 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/635 {"round":635,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/635/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node4 | 6m 54.481s | 2025-09-23 05:52:02.463 | 1536 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1 | |
| node2 | 6m 54.507s | 2025-09-23 05:52:02.489 | 7656 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 635 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/635 | |
| node2 | 6m 54.508s | 2025-09-23 05:52:02.490 | 7658 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 635 | |
| node3 | 6m 54.510s | 2025-09-23 05:52:02.492 | 7677 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/44 for round 635 | |
| node3 | 6m 54.513s | 2025-09-23 05:52:02.495 | 7678 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 635
  Timestamp: 2025-09-23T05:52:00.583682Z
  Next consensus number: 13859
  Legacy running event hash: 533064ce18c6ac561ef4854522b89bf277e864a0e03edee565b0fd2a00a4aea1361d217f29126b8ede6295e13b06f75f
  Legacy running event mnemonic: gravity-oil-boat-inspire
  Rounds non-ancient: 26
  Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
  Minimum judge hash code: 1637780202
  Root hash: 0099c0e2d0a559cd56d6a60a055651929d525f453ef19c98cf7db7d5d0eaa992543c0116dd5922790567d1136d4552f9
  (root) ConsistencyTestingToolState / basket-toilet-twin-defy
      0 SingletonNode PlatformStateService.PLATFORM_STATE /0 twelve-day-guitar-horn
      1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street
      2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem
      3 StringLeaf -5508992547220305676 /3 debate-badge-fuel-visit
      4 StringLeaf 635 /4 burden-couch-female-purity | |||||||||
| node3 | 6m 54.519s | 2025-09-23 05:52:02.501 | 7679 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/3/2025/09/23/2025-09-23T05+45+24.442108548Z_seq0_minr1_maxr501_orgn0.pces
  Last file: data/saved/preconsensus-events/3/2025/09/23/2025-09-23T05+50+40.575689970Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node3 | 6m 54.519s | 2025-09-23 05:52:02.501 | 7680 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 608
  File: data/saved/preconsensus-events/3/2025/09/23/2025-09-23T05+50+40.575689970Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node3 | 6m 54.519s | 2025-09-23 05:52:02.501 | 7681 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node3 | 6m 54.522s | 2025-09-23 05:52:02.504 | 7682 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node3 | 6m 54.522s | 2025-09-23 05:52:02.504 | 7683 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 635 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/635 {"round":635,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/635/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node3 | 6m 54.523s | 2025-09-23 05:52:02.505 | 7684 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/148 | |
| node2 | 6m 54.591s | 2025-09-23 05:52:02.573 | 7696 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 635 | |
| node2 | 6m 54.593s | 2025-09-23 05:52:02.575 | 7705 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 635
  Timestamp: 2025-09-23T05:52:00.583682Z
  Next consensus number: 13859
  Legacy running event hash: 533064ce18c6ac561ef4854522b89bf277e864a0e03edee565b0fd2a00a4aea1361d217f29126b8ede6295e13b06f75f
  Legacy running event mnemonic: gravity-oil-boat-inspire
  Rounds non-ancient: 26
  Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
  Minimum judge hash code: 1637780202
  Root hash: 0099c0e2d0a559cd56d6a60a055651929d525f453ef19c98cf7db7d5d0eaa992543c0116dd5922790567d1136d4552f9
  (root) ConsistencyTestingToolState / basket-toilet-twin-defy
      0 SingletonNode PlatformStateService.PLATFORM_STATE /0 twelve-day-guitar-horn
      1 SingletonNode RosterService.ROSTER_STATE /1 spell-wash-shove-street
      2 VirtualMap RosterService.ROSTERS /2 output-ball-unable-stem
      3 StringLeaf -5508992547220305676 /3 debate-badge-fuel-visit
      4 StringLeaf 635 /4 burden-couch-female-purity | |||||||||
| node2 | 6m 54.599s | 2025-09-23 05:52:02.581 | 7706 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/2/2025/09/23/2025-09-23T05+45+24.650265559Z_seq0_minr1_maxr501_orgn0.pces
  Last file: data/saved/preconsensus-events/2/2025/09/23/2025-09-23T05+50+40.409172680Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node2 | 6m 54.601s | 2025-09-23 05:52:02.583 | 7707 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 608
  File: data/saved/preconsensus-events/2/2025/09/23/2025-09-23T05+50+40.409172680Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node2 | 6m 54.602s | 2025-09-23 05:52:02.584 | 7708 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node2 | 6m 54.604s | 2025-09-23 05:52:02.586 | 7709 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node2 | 6m 54.604s | 2025-09-23 05:52:02.586 | 7710 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 635 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/635 {"round":635,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/635/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node2 | 6m 54.606s | 2025-09-23 05:52:02.588 | 7711 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/148 | |
| node3 | 7m 53.340s | 2025-09-23 05:53:01.322 | 8744 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 730 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node1 | 7m 53.486s | 2025-09-23 05:53:01.468 | 8669 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 730 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node2 | 7m 53.492s | 2025-09-23 05:53:01.474 | 8753 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 730 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node1 | 7m 53.819s | 2025-09-23 05:53:01.801 | 8672 | WARN | SOCKET_EXCEPTIONS | <<platform-core: SyncProtocolWith0 1 to 0>> | NetworkUtils: | Connection broken: 1 <- 0 | |
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-23T05:53:01.800372779Z
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-23T05:53:01.800372779Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.sendAndReceiveEvents(ShadowgraphSynchronizer.java:241)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:201)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
    ... 7 more
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 12 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$sendEventsTheyNeed$8(SyncUtils.java:234)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
    at java.base/java.io.DataInputStream.readByte(DataInputStream.java:275)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readByte(AugmentedDataInputStream.java:144)
    at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readEventsINeed$9(SyncUtils.java:278)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
    ... 12 more | |||||||||
| node2 | 7m 53.819s | 2025-09-23 05:53:01.801 | 8756 | WARN | SOCKET_EXCEPTIONS | <<platform-core: SyncProtocolWith0 2 to 0>> | NetworkUtils: | Connection broken: 2 <- 0 | |
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-23T05:53:01.800608415Z
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
	at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
	at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
	at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
	at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
	at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-23T05:53:01.800608415Z
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.sendAndReceiveEvents(ShadowgraphSynchronizer.java:241)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:201)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
	... 7 more
Caused by: java.net.SocketException: Connection reset
	at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
	at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
	at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
	at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
	at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
	at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
	at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
	at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
	at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
	at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
	at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
	at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
	at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
	at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
	at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
	at java.base/java.io.DataInputStream.readByte(DataInputStream.java:275)
	at org.hiero.base.io.streams.AugmentedDataInputStream.readByte(AugmentedDataInputStream.java:144)
	at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readEventsINeed$9(SyncUtils.java:278)
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
	... 12 more
| node3 | 7m 53.820s | 2025-09-23 05:53:01.802 | 8747 | WARN | SOCKET_EXCEPTIONS | <<platform-core: SyncProtocolWith0 3 to 0>> | NetworkUtils: | Connection broken: 3 <- 0 | |
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-23T05:53:01.800673361Z
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
	at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
	at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
	at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
	at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
	at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-23T05:53:01.800673361Z
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:180)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
	... 7 more
Caused by: java.net.SocketException: Connection reset
	at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
	at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
	at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
	at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
	at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
	at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
	at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
	at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
	at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
	at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
	at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
	at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
	at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
	at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
	at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
	at java.base/java.io.DataInputStream.readBoolean(DataInputStream.java:255)
	at org.hiero.base.io.streams.AugmentedDataInputStream.readBoolean(AugmentedDataInputStream.java:137)
	at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readMyTipsTheyHave$7(SyncUtils.java:163)
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
	... 11 more
| node1 | 7m 53.885s | 2025-09-23 05:53:01.867 | 8673 | WARN | SOCKET_EXCEPTIONS | <<platform-core: SyncProtocolWith4 1 to 4>> | NetworkUtils: | Connection broken: 1 -> 4 | |
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-23T05:53:01.867195383Z
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
	at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
	at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
	at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
	at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
	at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-23T05:53:01.867195383Z
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.sendAndReceiveEvents(ShadowgraphSynchronizer.java:241)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:201)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
	... 7 more
	Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
		at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
		at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
		at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
		... 12 more
	Caused by: java.net.SocketException: Connection or outbound has closed
		at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
		at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
		at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
		at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
		at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
		at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
		at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
		at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$sendEventsTheyNeed$8(SyncUtils.java:234)
		at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
		at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
		at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
		at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
		at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
		... 2 more
Caused by: java.net.SocketException: Connection reset
	at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
	at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
	at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
	at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
	at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
	at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
	at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
	at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
	at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
	at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
	at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
	at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
	at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
	at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
	at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
	at java.base/java.io.DataInputStream.readByte(DataInputStream.java:275)
	at org.hiero.base.io.streams.AugmentedDataInputStream.readByte(AugmentedDataInputStream.java:144)
	at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readEventsINeed$9(SyncUtils.java:278)
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
	... 12 more
| node3 | 7m 53.885s | 2025-09-23 05:53:01.867 | 8748 | WARN | SOCKET_EXCEPTIONS | <<platform-core: SyncProtocolWith4 3 to 4>> | NetworkUtils: | Connection broken: 3 -> 4 | |
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-23T05:53:01.867472440Z
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
	at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
	at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
	at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
	at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
	at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-23T05:53:01.867472440Z
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.sendAndReceiveEvents(ShadowgraphSynchronizer.java:241)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:201)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
	... 7 more
Caused by: java.net.SocketException: Connection reset
	at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
	at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
	at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
	at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
	at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
	at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
	at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
	at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
	at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
	at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
	at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
	at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
	at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
	at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
	at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
	at java.base/java.io.DataInputStream.readByte(DataInputStream.java:275)
	at org.hiero.base.io.streams.AugmentedDataInputStream.readByte(AugmentedDataInputStream.java:144)
	at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readEventsINeed$9(SyncUtils.java:278)
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
	... 12 more
| node2 | 7m 53.886s | 2025-09-23 05:53:01.868 | 8757 | WARN | SOCKET_EXCEPTIONS | <<platform-core: SyncProtocolWith4 2 to 4>> | NetworkUtils: | Connection broken: 2 -> 4 | |
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-23T05:53:01.867986757Z
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
	at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
	at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
	at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
	at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
	at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-23T05:53:01.867986757Z
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.sendAndReceiveEvents(ShadowgraphSynchronizer.java:241)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:201)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
	... 7 more
Caused by: java.net.SocketException: Connection reset
	at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
	at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
	at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
	at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
	at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
	at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
	at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
	at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
	at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
	at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
	at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
	at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
	at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
	at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
	at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
	at java.base/java.io.DataInputStream.readByte(DataInputStream.java:275)
	at org.hiero.base.io.streams.AugmentedDataInputStream.readByte(AugmentedDataInputStream.java:144)
	at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readEventsINeed$9(SyncUtils.java:278)
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
	... 12 more
| node2 | 7m 53.935s | 2025-09-23 05:53:01.917 | 8758 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 730 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/730 | |
| node2 | 7m 53.936s | 2025-09-23 05:53:01.918 | 8759 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/49 for round 730 | |
| node3 | 7m 53.979s | 2025-09-23 05:53:01.961 | 8749 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 730 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/730 | |
| node3 | 7m 53.980s | 2025-09-23 05:53:01.962 | 8750 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for round 730 | |
| node2 | 7m 54.024s | 2025-09-23 05:53:02.006 | 8794 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/49 for round 730 | |
| node2 | 7m 54.026s | 2025-09-23 05:53:02.008 | 8795 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
Round: 730
Timestamp: 2025-09-23T05:53:00.064522373Z
Next consensus number: 16330
Legacy running event hash: 3960983badabc9f0821c16815b1c28274d9b962d2e0b727bb4c9754d3cbef14be45d21050b60388d581c578a67e15cef
Legacy running event mnemonic: clown-weather-enjoy-search
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -415506090
Root hash: c1ab2d3077d07cc16f852142add8308d7c5a38c5723edcde4c8bedff3f23ece2faf918670996e1ddc4429ecc47fc086c
(root) ConsistencyTestingToolState /  skirt-plastic-wheel-oblige
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0  burden-emerge-control-clock
  1 SingletonNode RosterService.ROSTER_STATE /1  spell-wash-shove-street
  2 VirtualMap RosterService.ROSTERS /2  output-ball-unable-stem
  3 StringLeaf 6840133897623644085 /3  silver-file-balance-vital
  4 StringLeaf 730 /4  train-betray-elegant-suffer
| node2 | 7m 54.034s | 2025-09-23 05:53:02.016 | 8796 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
First file: data/saved/preconsensus-events/2/2025/09/23/2025-09-23T05+45+24.650265559Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/2/2025/09/23/2025-09-23T05+50+40.409172680Z_seq1_minr474_maxr5474_orgn0.pces
| node2 | 7m 54.034s | 2025-09-23 05:53:02.016 | 8797 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
Lower bound: 703
File: data/saved/preconsensus-events/2/2025/09/23/2025-09-23T05+50+40.409172680Z_seq1_minr474_maxr5474_orgn0.pces
| node2 | 7m 54.034s | 2025-09-23 05:53:02.016 | 8798 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node2 | 7m 54.039s | 2025-09-23 05:53:02.021 | 8799 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node2 | 7m 54.039s | 2025-09-23 05:53:02.021 | 8800 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 730 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/730 {"round":730,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/730/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node2 | 7m 54.041s | 2025-09-23 05:53:02.023 | 8801 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/238 | |
| node3 | 7m 54.083s | 2025-09-23 05:53:02.065 | 8785 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for round 730 | |
| node3 | 7m 54.085s | 2025-09-23 05:53:02.067 | 8786 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
Round: 730
Timestamp: 2025-09-23T05:53:00.064522373Z
Next consensus number: 16330
Legacy running event hash: 3960983badabc9f0821c16815b1c28274d9b962d2e0b727bb4c9754d3cbef14be45d21050b60388d581c578a67e15cef
Legacy running event mnemonic: clown-weather-enjoy-search
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -415506090
Root hash: c1ab2d3077d07cc16f852142add8308d7c5a38c5723edcde4c8bedff3f23ece2faf918670996e1ddc4429ecc47fc086c
(root) ConsistencyTestingToolState /  skirt-plastic-wheel-oblige
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0  burden-emerge-control-clock
  1 SingletonNode RosterService.ROSTER_STATE /1  spell-wash-shove-street
  2 VirtualMap RosterService.ROSTERS /2  output-ball-unable-stem
  3 StringLeaf 6840133897623644085 /3  silver-file-balance-vital
  4 StringLeaf 730 /4  train-betray-elegant-suffer
| node3 | 7m 54.095s | 2025-09-23 05:53:02.077 | 8787 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
First file: data/saved/preconsensus-events/3/2025/09/23/2025-09-23T05+45+24.442108548Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/3/2025/09/23/2025-09-23T05+50+40.575689970Z_seq1_minr474_maxr5474_orgn0.pces
| node3 | 7m 54.095s | 2025-09-23 05:53:02.077 | 8788 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
Lower bound: 703
File: data/saved/preconsensus-events/3/2025/09/23/2025-09-23T05+50+40.575689970Z_seq1_minr474_maxr5474_orgn0.pces
| node3 | 7m 54.095s | 2025-09-23 05:53:02.077 | 8789 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node3 | 7m 54.099s | 2025-09-23 05:53:02.081 | 8790 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node3 | 7m 54.100s | 2025-09-23 05:53:02.082 | 8791 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 730 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/730 {"round":730,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/730/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node3 | 7m 54.101s | 2025-09-23 05:53:02.083 | 8792 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/238 | |
| node2 | 7m 54.196s | 2025-09-23 05:53:02.178 | 8802 | WARN | SOCKET_EXCEPTIONS | <<platform-core: SyncProtocolWith1 2 to 1>> | NetworkUtils: | Connection broken: 2 <- 1 | |
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-23T05:53:02.178367190Z
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
	at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
	at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
	at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
	at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
	at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-23T05:53:02.178367190Z
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.sendAndReceiveEvents(ShadowgraphSynchronizer.java:241)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:201)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
	... 7 more
Caused by: java.net.SocketException: Connection reset
	at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
	at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
	at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
	at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
	at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
	at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
	at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
	at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
	at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
	at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
	at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
	at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
	at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
	at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
	at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
	at java.base/java.io.DataInputStream.readByte(DataInputStream.java:275)
	at org.hiero.base.io.streams.AugmentedDataInputStream.readByte(AugmentedDataInputStream.java:144)
	at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readEventsINeed$9(SyncUtils.java:278)
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
	... 12 more
| node3 | 7m 54.196s | 2025-09-23 05:53:02.178 | 8793 | WARN | SOCKET_EXCEPTIONS | <<platform-core: SyncProtocolWith1 3 to 1>> | NetworkUtils: | Connection broken: 3 <- 1 | |
| java.net.SocketException: Connection reset
	at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
	at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
	at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
	at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
	at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
	at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
	at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
	at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
	at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
	at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
	at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
	at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
	at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
	at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
	at java.base/java.io.FilterInputStream.read(FilterInputStream.java:71)
	at org.hiero.base.io.streams.AugmentedDataInputStream.read(AugmentedDataInputStream.java:57)
	at com.swirlds.platform.network.communication.states.SentInitiate.transition(SentInitiate.java:73)
	at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
	at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
	at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
	at java.base/java.lang.Thread.run(Thread.java:1583) | |||||||||