| node0 | 0.000ns | 2025-11-17 21:06:25.987 | 1 | INFO | STARTUP | <main> | StaticPlatformBuilder: | ////////////////////// // Node is Starting // ////////////////////// | |
| node0 | 86.000ms | 2025-11-17 21:06:26.073 | 2 | DEBUG | STARTUP | <main> | StaticPlatformBuilder: | main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload] | |
| node0 | 101.000ms | 2025-11-17 21:06:26.088 | 3 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node1 | 170.000ms | 2025-11-17 21:06:26.157 | 1 | INFO | STARTUP | <main> | StaticPlatformBuilder: | ////////////////////// // Node is Starting // ////////////////////// | |
| node0 | 209.000ms | 2025-11-17 21:06:26.196 | 4 | INFO | STARTUP | <main> | Browser: | The following nodes [0] are set to run locally | |
| node0 | 237.000ms | 2025-11-17 21:06:26.224 | 5 | DEBUG | STARTUP | <main> | BootstrapUtils: | Scanning the classpath for RuntimeConstructable classes | |
| node1 | 262.000ms | 2025-11-17 21:06:26.249 | 2 | DEBUG | STARTUP | <main> | StaticPlatformBuilder: | main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload] | |
| node1 | 278.000ms | 2025-11-17 21:06:26.265 | 3 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node1 | 396.000ms | 2025-11-17 21:06:26.383 | 4 | INFO | STARTUP | <main> | Browser: | The following nodes [1] are set to run locally | |
| node1 | 429.000ms | 2025-11-17 21:06:26.416 | 5 | DEBUG | STARTUP | <main> | BootstrapUtils: | Scanning the classpath for RuntimeConstructable classes | |
| node3 | 1.103s | 2025-11-17 21:06:27.090 | 1 | INFO | STARTUP | <main> | StaticPlatformBuilder: | ////////////////////// // Node is Starting // ////////////////////// | |
| node3 | 1.195s | 2025-11-17 21:06:27.182 | 2 | DEBUG | STARTUP | <main> | StaticPlatformBuilder: | main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload] | |
| node3 | 1.212s | 2025-11-17 21:06:27.199 | 3 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node3 | 1.325s | 2025-11-17 21:06:27.312 | 4 | INFO | STARTUP | <main> | Browser: | The following nodes [3] are set to run locally | |
| node3 | 1.355s | 2025-11-17 21:06:27.342 | 5 | DEBUG | STARTUP | <main> | BootstrapUtils: | Scanning the classpath for RuntimeConstructable classes | |
| node0 | 1.508s | 2025-11-17 21:06:27.495 | 6 | DEBUG | STARTUP | <main> | BootstrapUtils: | Done with registerConstructables, time taken 1270ms | |
| node0 | 1.518s | 2025-11-17 21:06:27.505 | 7 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | constructor called in Main. | |
| node0 | 1.520s | 2025-11-17 21:06:27.507 | 8 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node0 | 1.561s | 2025-11-17 21:06:27.548 | 9 | INFO | STARTUP | <main> | PrometheusEndpoint: | PrometheusEndpoint: Starting server listing on port: 9999 | |
| node0 | 1.620s | 2025-11-17 21:06:27.607 | 10 | WARN | STARTUP | <main> | CryptoStatic: | There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB. | |
| node0 | 1.621s | 2025-11-17 21:06:27.608 | 11 | DEBUG | STARTUP | <main> | CryptoStatic: | Started generating keys | |
| node4 | 1.730s | 2025-11-17 21:06:27.717 | 1 | INFO | STARTUP | <main> | StaticPlatformBuilder: | ////////////////////// // Node is Starting // ////////////////////// | |
| node1 | 1.758s | 2025-11-17 21:06:27.745 | 6 | DEBUG | STARTUP | <main> | BootstrapUtils: | Done with registerConstructables, time taken 1328ms | |
| node1 | 1.768s | 2025-11-17 21:06:27.755 | 7 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | constructor called in Main. | |
| node1 | 1.771s | 2025-11-17 21:06:27.758 | 8 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node1 | 1.814s | 2025-11-17 21:06:27.801 | 9 | INFO | STARTUP | <main> | PrometheusEndpoint: | PrometheusEndpoint: Starting server listing on port: 9999 | |
| node4 | 1.823s | 2025-11-17 21:06:27.810 | 2 | DEBUG | STARTUP | <main> | StaticPlatformBuilder: | main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload] | |
| node4 | 1.840s | 2025-11-17 21:06:27.827 | 3 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node1 | 1.890s | 2025-11-17 21:06:27.877 | 10 | WARN | STARTUP | <main> | CryptoStatic: | There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB. | |
| node1 | 1.890s | 2025-11-17 21:06:27.877 | 11 | DEBUG | STARTUP | <main> | CryptoStatic: | Started generating keys | |
| node4 | 1.956s | 2025-11-17 21:06:27.943 | 4 | INFO | STARTUP | <main> | Browser: | The following nodes [4] are set to run locally | |
| node4 | 1.988s | 2025-11-17 21:06:27.975 | 5 | DEBUG | STARTUP | <main> | BootstrapUtils: | Scanning the classpath for RuntimeConstructable classes | |
| node0 | 2.403s | 2025-11-17 21:06:28.390 | 12 | DEBUG | STARTUP | <main> | CryptoStatic: | Done generating keys | |
| node0 | 2.488s | 2025-11-17 21:06:28.475 | 15 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node0 | 2.491s | 2025-11-17 21:06:28.478 | 16 | INFO | STARTUP | <main> | StartupStateUtils: | No saved states were found on disk. | |
| node0 | 2.524s | 2025-11-17 21:06:28.511 | 21 | INFO | STARTUP | <main> | ConsistencyTestingToolState: | New State Constructed. | |
| node3 | 2.659s | 2025-11-17 21:06:28.646 | 6 | DEBUG | STARTUP | <main> | BootstrapUtils: | Done with registerConstructables, time taken 1303ms | |
| node3 | 2.668s | 2025-11-17 21:06:28.655 | 7 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | constructor called in Main. | |
| node3 | 2.671s | 2025-11-17 21:06:28.658 | 8 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node3 | 2.709s | 2025-11-17 21:06:28.696 | 9 | INFO | STARTUP | <main> | PrometheusEndpoint: | PrometheusEndpoint: Starting server listing on port: 9999 | |
| node1 | 2.721s | 2025-11-17 21:06:28.708 | 12 | DEBUG | STARTUP | <main> | CryptoStatic: | Done generating keys | |
| node3 | 2.775s | 2025-11-17 21:06:28.762 | 10 | WARN | STARTUP | <main> | CryptoStatic: | There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB. | |
| node3 | 2.776s | 2025-11-17 21:06:28.763 | 11 | DEBUG | STARTUP | <main> | CryptoStatic: | Started generating keys | |
| node1 | 2.813s | 2025-11-17 21:06:28.800 | 15 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node1 | 2.815s | 2025-11-17 21:06:28.802 | 16 | INFO | STARTUP | <main> | StartupStateUtils: | No saved states were found on disk. | |
| node1 | 2.850s | 2025-11-17 21:06:28.837 | 21 | INFO | STARTUP | <main> | ConsistencyTestingToolState: | New State Constructed. | |
| node2 | 3.193s | 2025-11-17 21:06:29.180 | 1 | INFO | STARTUP | <main> | StaticPlatformBuilder: | ////////////////////// // Node is Starting // ////////////////////// | |
| node0 | 3.265s | 2025-11-17 21:06:29.252 | 24 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node0 | 3.266s | 2025-11-17 21:06:29.253 | 27 | INFO | STARTUP | <main> | BootstrapUtils: | Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]. | |
| node0 | 3.272s | 2025-11-17 21:06:29.259 | 28 | INFO | STARTUP | <main> | AddressBookInitializer: | Starting from genesis: using the config address book. | |
| node0 | 3.281s | 2025-11-17 21:06:29.268 | 29 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node0 | 3.283s | 2025-11-17 21:06:29.270 | 30 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node2 | 3.288s | 2025-11-17 21:06:29.275 | 2 | DEBUG | STARTUP | <main> | StaticPlatformBuilder: | main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload] | |
| node2 | 3.304s | 2025-11-17 21:06:29.291 | 3 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node2 | 3.424s | 2025-11-17 21:06:29.411 | 4 | INFO | STARTUP | <main> | Browser: | The following nodes [2] are set to run locally | |
| node2 | 3.455s | 2025-11-17 21:06:29.442 | 5 | DEBUG | STARTUP | <main> | BootstrapUtils: | Scanning the classpath for RuntimeConstructable classes | |
| node4 | 3.602s | 2025-11-17 21:06:29.589 | 6 | DEBUG | STARTUP | <main> | BootstrapUtils: | Done with registerConstructables, time taken 1613ms | |
| node4 | 3.613s | 2025-11-17 21:06:29.600 | 7 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | constructor called in Main. | |
| node4 | 3.616s | 2025-11-17 21:06:29.603 | 8 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node3 | 3.618s | 2025-11-17 21:06:29.605 | 12 | DEBUG | STARTUP | <main> | CryptoStatic: | Done generating keys | |
| node1 | 3.629s | 2025-11-17 21:06:29.616 | 24 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node1 | 3.631s | 2025-11-17 21:06:29.618 | 27 | INFO | STARTUP | <main> | BootstrapUtils: | Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]. | |
| node1 | 3.638s | 2025-11-17 21:06:29.625 | 28 | INFO | STARTUP | <main> | AddressBookInitializer: | Starting from genesis: using the config address book. | |
| node1 | 3.648s | 2025-11-17 21:06:29.635 | 29 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node1 | 3.650s | 2025-11-17 21:06:29.637 | 30 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node4 | 3.656s | 2025-11-17 21:06:29.643 | 9 | INFO | STARTUP | <main> | PrometheusEndpoint: | PrometheusEndpoint: Starting server listing on port: 9999 | |
| node3 | 3.716s | 2025-11-17 21:06:29.703 | 15 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node3 | 3.719s | 2025-11-17 21:06:29.706 | 16 | INFO | STARTUP | <main> | StartupStateUtils: | No saved states were found on disk. | |
| node4 | 3.724s | 2025-11-17 21:06:29.711 | 10 | WARN | STARTUP | <main> | CryptoStatic: | There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB. | |
| node4 | 3.724s | 2025-11-17 21:06:29.711 | 11 | DEBUG | STARTUP | <main> | CryptoStatic: | Started generating keys | |
| node3 | 3.758s | 2025-11-17 21:06:29.745 | 21 | INFO | STARTUP | <main> | ConsistencyTestingToolState: | New State Constructed. | |
| node0 | 4.409s | 2025-11-17 21:06:30.396 | 31 | INFO | STARTUP | <main> | OSHealthChecker: | PASSED - Clock Source Speed Check Report[callsPerSec=26335490] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=244600, randomLong=-8004791795026607491, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=9060, randomLong=7099035726573307191, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=1342629, data=35, exception=null] OS Health Check Report - Complete (took 1022 ms) | |
| node0 | 4.439s | 2025-11-17 21:06:30.426 | 32 | DEBUG | STARTUP | <main> | BootstrapUtils: | jvmPauseDetectorThread started | |
| node0 | 4.447s | 2025-11-17 21:06:30.434 | 33 | INFO | STARTUP | <main> | StandardScratchpad: | Scratchpad platform.iss contents: LAST_ISS_ROUND null | |
| node0 | 4.449s | 2025-11-17 21:06:30.436 | 34 | INFO | STARTUP | <main> | PlatformBuilder: | Default platform pool parallelism: 8 | |
| node0 | 4.532s | 2025-11-17 21:06:30.519 | 35 | INFO | STARTUP | <main> | SwirldsPlatform: | Starting with roster history: | |
| RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIdUmpLKzyXgUwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBALXCoDQ+HOVsEDTZpFuJITSaGwaKX2is5K1P/lV+G+ll6u36IdqKNnZIirJrpX2N0Ad6NeF/oFcMhietrKt818PDA9Tbb2tqcHNKTxxZAEj7amQTsrU4EsNmUhaPgMs89yj9WLxCXVzW05cQjqYEA/hymzohWs1BdU3Y2KdmELe0v5fzRgDpNgYHhUN7IrlrlgXEWpuKRskBYc4PIvyACijY0/zkeEAyHOshYYGKhQbNm/NGWhFq83ro77CZZhX3Vl7hRnHLaEoCEE8atY8R1Txhy8aObhiS6R8ZVRTkZLar/FG/xe78RQfwHHD1al2w5oHR7xgTZylhbD+nVQ09Zmi25USpvqwumbMBE0OWhV+VH1WLCHfLQs6/5yuDjeZ/0D9tpQ8pfkiEkGLedzUzQkq+4/HmN4IFTOhgJHlu1tVUqohZIPZ5zSzqkqFzFQGRo2uAX8C2EJ3qgQMAEOpH8iOjiSKsezlIPuwvmrVDPxVfpY2Cq60oxRu6B8bZdbQkfwIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQAloxwiVu7pBhkO4fLqYRw4FC0VEx+c47W4xnrq3G/uXMGwE2Mfwple9FZnfT9JgSoT1UVw+cigo4720WdrPqkK8qnA3/PzGXlfJ3k6eFcBuli/KY1TakIJUAxFt5biNKatheMwAKsbF/JyVyaqG2dbSaXQ6hZBLQTYmLrmFWMvi9QdM1S8vNVMjn0hE2qQJtnVRuVwqRaAQ225jDv2CUCT28t0EWE6ccbiRi74l8KoW1Lo3v2EQ6ZZ89Xt3CwFSQHa6YVT685ECy82qMysU+YHBe9WmwJW05UAAY7JRsOo+RuuU/r4acNLmzprG+l7qsqqPkwXTcziw9Y2OYsFgY4bTlIOV0JC0AYApctDB3gbn83LM73CWccGrXq0liSV0wL11wscH3gFohXrwb646+6hgncZiDshlZlWaFSkHQJAxTR9bsbsCwKdZpzIIVOVTOT/3oLQKCCQvPriTpJiNa0P6gB0pq64lNcyG9fL8vS3YFFnWJTZwb8ZzGK+LZ91/2Y=", "gossipEndpoint": [{ "ipAddressV4": "IjgVaw==", "port": 30124 }, { "ipAddressV4": "CoAAGA==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAJguXwyGFpb8MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTIwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTIwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDYXoYHBtw8adD5sxLZSnlG9XgBLWVbIDl3YA4rZZ11cgl6FG2TvF8UVNXQ177cRm1xUUJRI5ulSgDofnm7Iuf6c/GoQrud2nP1yMWewGslwiEi1h2pxbN7doFvn/92Y0lJVwSV/vOpbIyPRoMeF0jXd7TEI7dYj4S7gV9uWmQCIWjwTZqVsjIAtzEkYnmS0/m5XuD9MJsin8OQRu/PEFL8qaVPQJ2GhOhpUJqvADQ/Lsq/FHcPjylcRcnUQlFRojk2jqugtoRegByjPrAOSYGJeWUCVYmd7W51L/AkVx1rDLeHj0zLTTzQRF5G56i+S+tAcpY/uiCrwLvszFlDlD1diOuaucmu54lalrSTlVe5eOyq2ga2tKi11LQ+w09105zLyRWk7DBU93f5dTYNSmokI7b4sVRxu6SP0p/F9wND77wv2Ax5OpIWWty8zy8Y+xOuRyFu/rJ4ddDmRYvRmptM0rCAfv6hgd3m5Y/OAadQm/OuN91Uq9PIJdlMtjDbIfECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEANutmL3V1PlvlsZ6xG8Sx9cKTok3kf3rBf7D7eE8Nn8ryHi3cw9CvCaj1E6zmTTh9k23DAZVWulhjTY5GWcx5NO7QAWjKau44g/HecNNrWsD/+nIrhmAk2WxKp175CwqJaIWA7CM6VMfFktjaflUPcB6RJnHrAa8M1HUpEsBz0mFmLz7lIaDemxYCE8M8slb6wTMjpL83GB+ejudRe7YK2ZWixM+CGp0ARkV+EecHaCXgEoROUNwP6mZVJcgSVR1QBQwcGAMIrutsKENM8HR9o3LWacigoJXf+IX8c6aJhrHfFvm62q+hi3baj7iR6gebEdWPtmEXgoVWOk230fLGyPU1oBxaDdYa8V4+ZFv03O91By9tuFrwZOcLCb4CPRyr8A47lHNjRIeo2nUF/c+SjV0eBcPKCnn1nW/AQWCxJ0QzzG6tEeMAGdDrE2ujPlB+Y9Sn8vB0zjYQHTr1NKyyXNogB4y48jofLDLDGOQYI6uP2fDgZeiq4dV8w91WbPHV", "gossipEndpoint": [{ "ipAddressV4": "I8DbAg==", "port": 30125 }, { "ipAddressV4": "CoAAGQ==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJwswl59m488MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDAqlNMpfduuW0ETQVjdKf5ZBe3Ug/ybRMoCWIlue8UoxFzamAtoeFEW3GVi862iImRVyHbkBZzDQUw4ABwMdxfzTL9voozkMaOZb4KQ9yZ9zNLAAmSSuE6RFmSJnBtfufxFXqiu6esbcvyropjZLc65F2uoMCpKN0CHFpWEb2GZAaipp7WCOon0NllDLqkjPylluXO4mjbzzMSDPbBWRD8VjjkxZeszWSXYxz9hqcRYX01CGg+jhooCQ6j2yB8sfFAffIeTG6GSV1uCFa4san2emhQWpr+cHaVYJMtejL43HaEVQnF3vh5Z10T/7co63C63aay2hs6Bx5SschosyYiafI7GtbQ4qpOgjEDFT1jlydK21gy6MV3SFEYwcUfxvxxRj6pS7xiMFn4FYnBKPJWkaDkwTqboEshxstvASQOW993uEwzh4EjctRHSjSuTU6S9OsWi5I5cRF+xK6GaWsTp0KyO8uVpuM9kZfpOcor294quyKJ9nylNyIt/m8Q8/ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAqPLB/xr0Yv1l9w/RO+bqFtl8TkxF/6jOqoEUXY06dEInopLYpmkksZZ9G8vebt6hAoLjaxNMdRqCkzKgy4jn7/SQZNV9FMbZ7ckiDxsBxYZ2ZaBootuWzzVD6hCSO3Tg6JgkIzldtFtNcDVBRgZnHg+Rl6hn+gFV5S2OTTTPHWK7GHwgHXLhK7N0RL4YVrRCi/HTUZnuYCjBwvdDte5iqytY05cAO4p72P6YtDaOdAfL/IIKd1ylCWITDqTp/JDBz1uxjQmsXLVD/KEEtlvYlGjIr+wUUqIUPhFvB6ajl2NO0D/r+t1BH454zbodU92QnOJpXpoNuOv7jjALHCqo70mCSwTNUSZuVP6/KLmQe8sSzYs7O/c25FzHKBYy+aZujoa/X7aI6XVmsUkj6ae9MSvQurk0jMNg/Jy5EtWOMy7WEuyadrAv6KSP3oIfmL9jWoPcyOMfvjRHxGqOfZuFZatAwswY6O0E3ATTrN03t/BVqNHIYIXc6UOiUTo2Nx56", "gossipEndpoint": [{ "ipAddressV4": "IjlJoA==", "port": 30126 }, { "ipAddressV4": "CoAAHA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAOxH0o7YkAUoMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDf6+SJl+puqRNd5r2Tb802jQTqPm7k3NXIeU8NQ3Hy9p0G+9p4Hgnt3ftipar7lKPKnp4PFrOP7E7XSKpafxK2OVQ0jTMvc6Yjqt+9mzyNSI1I8cSHTmhJ7kMBt0+NwVM8QN+fbKcbQaoNiPwMcckVtGeMad4aZM6hRyxzI0H3wgMj4JiM9VRwx7JbEo3R7akRwLwGr9ZQm2EQwqiyReNkBnXrsyP4KPPVAoeMfGchoAuBbV+r6v1OeYddocYmZkrsvMXUKF/uEcgd8gTu+pv3jObwIEVqXo1yC6ZlCFqO7LIvT8jTAAljkszoo67ykXTbKS0PZeLDg6nvdPvBMQ50yjfswR88S6N8VU6pud7Y+VbMYUiGzlrFi4MB9dikAjEj4PEetQyZdn84ZXGxerXlU/vTO2Fp4i1ec5rmX1P0WYMlbNELE408j5nfCfzD/qdcF5HZAiUVTYU/SWpzWcn34++KGpuqZZQdsGwCLQWeMeA/OEemYChis4cO94aOzrECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAlj5YIsbYXk2JGP9kRCBLDgz27ymYi1KDbO8g18V4T0zj2Zl7858U7mF9UBSSW+Cjl1UtUdvqFWZhh8jRoO3Jov1QGTULHRfyyPElD4VpwFribiu4GYJaodYy6NE50WwSJf32gLG0jHQWt7q+cOrn6WaG2h8O1sIxbTlnu1kqKQUQtu4oX8u23b5m9QXVJfJVdecwD5Rmab2d3dq/NNv2iNELH0myqtcoqw26xwIvXwaS4Gqi+Y0cOfjWL5Gv5AHIwvBXGIh3KUU7pbyBzqjkigbzSeoZw0C8G2cRTl0+QTuet2SVYlFh5J9/FBLvIfMfIpguglaU6xTVoRpo7RF24qQKFt2IlBROpqcwl0FyfE+2c19FGt1V8E5dYqE4T2mHT6FSOI3DckA2afBm1OCeMNtkqCQT8x+JvdKrgUh44QDm4PIVZDzaxog/zOzRWPCgpCPq0HcNMzgCVFt+4q8eTL9Ju/rQcS9bDosjMA69NGLIOCdPW2i/gkS9x9rTXgyp", "gossipEndpoint": [{ "ipAddressV4": "aJuKEQ==", "port": 30127 }, { "ipAddressV4": "CoAAGg==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIXlngkVEv6iMwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAL4o3FK8th1cG+FSlw4iT9FlkwK+hOj4Ay6Z70mZlsNwszgxvddUEO4BEdA1iSWfxkYOLl4QwwPr3l394a07VfB5OK3dqJ6CjVdByyvzghtk3gOpkskWlJxp6vah7BbIJFWE8off7fhCdwAGSrwIRdGE8u8GbKJIdHk6/XyjB3j0BXTIgeaPTJxLeuz/2l/dQVRMXyZNxlc5UVQYnX9haMRk7M5bkb9uwfYPRikEJFp6G72x7M7Q9lBGJ3ArCQn/lPJfHSg01GxfDhWH8DOwLaFdv1bCs2zHTn7R7Wq9ymXvkUsZhlYO4mLR8HKDcM3sCrJa2rg8vgnIoZupHABKxkgtT2wxV7fM5f2oiz0mDYDTRJpgmK1lmNANj2tKnGqeDnsW7Q3zwufgZZhbks8+8uigyOyKNbp6D7Vv5KeYRibjr/xh+yWT0v02dtpBIdhqDa5CUVD9fCwigZj3PQc8N4e47ZL6s1pXpQ6Cf0lB0fSsvyhnGRa8HMx2q5eg5j/lCQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQCr9yUzOoi0xhoDE1mqR3FR/iVCq9PaBUURWL743LDMrlEvpzKX0upcwwwdgJFjVqVUywh6rKeHQt4O4UV6FIbpp0PSjSE7XZSK3UNqnhZJhQ3aNrOP+6wBhm2B0ZjrxyMS1EWeD9tcNkdYluO00RlieAEV4zwoAfeFPSB21iXW5dU8idhNuTLptDc7SJoErxN+44jvcrSe/ZhpQohG6WfyDPH0BE1tyzsiD29PAWKkrfhg5kzjTAP/qFp+ByazeltP9/F0NXI5AHbE0pKYr56XUlwDfDZOTU9b1YeS7kKyPvccvC2j9NjGGM7NjafdFLHUTYBZiNUTZXVstddYtTCVbTqI7I/x6hoeeNVDZv7XluwZLrYsDNsNrWU3c9VijPK1CE5Owy+gJoGgxEHfA/n9Jvc3lEesqKBpW92RazkpHW2eD9wh8Ayv3q6PNDGzWyiXA8YWW6yD/dIp2Oh8szZUfOXy8sQ8VW86T6RsqGP5CKKPGW1NnP/KTKe5/WoBLZQ=", "gossipEndpoint": [{ "ipAddressV4": "IhuUGg==", "port": 30128 }, { "ipAddressV4": "CoAAGw==", "port": 30128 }] }] } | |||||||||
| node3 | 4.554s | 2025-11-17 21:06:30.541 | 24 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node4 | 4.556s | 2025-11-17 21:06:30.543 | 12 | DEBUG | STARTUP | <main> | CryptoStatic: | Done generating keys | |
| node0 | 4.557s | 2025-11-17 21:06:30.544 | 36 | INFO | STARTUP | <main> | TransactionHandlingHistory: | Consistency testing tool log path: data/saved/consistency-test/0/ConsistencyTestLog.csv | |
| node0 | 4.558s | 2025-11-17 21:06:30.545 | 37 | INFO | STARTUP | <main> | TransactionHandlingHistory: | No log file found. Starting without any previous history | |
| node3 | 4.558s | 2025-11-17 21:06:30.545 | 27 | INFO | STARTUP | <main> | BootstrapUtils: | Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]. | |
| node3 | 4.565s | 2025-11-17 21:06:30.552 | 28 | INFO | STARTUP | <main> | AddressBookInitializer: | Starting from genesis: using the config address book. | |
| node0 | 4.571s | 2025-11-17 21:06:30.558 | 38 | INFO | STARTUP | <main> | StateInitializer: | The platform is using the following initial state: Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: d1414917bf3d1ee8f982919e8ccc864e4703dfa42b094f2ed7c395ba1463b55b5894c4eb9d7e3860dd5e55677daf79af (root) VirtualMap state / subject-carry-blur-tackle | |
| node0 | 4.574s | 2025-11-17 21:06:30.561 | 40 | INFO | RECONNECT | <<platform-core: reconnectController>> | ReconnectController: | Starting the ReconnectController | |
| node3 | 4.576s | 2025-11-17 21:06:30.563 | 29 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node3 | 4.579s | 2025-11-17 21:06:30.566 | 30 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node4 | 4.651s | 2025-11-17 21:06:30.638 | 15 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node4 | 4.653s | 2025-11-17 21:06:30.640 | 16 | INFO | STARTUP | <main> | StartupStateUtils: | No saved states were found on disk. | |
| node4 | 4.698s | 2025-11-17 21:06:30.685 | 21 | INFO | STARTUP | <main> | ConsistencyTestingToolState: | New State Constructed. | |
| node1 | 4.771s | 2025-11-17 21:06:30.758 | 31 | INFO | STARTUP | <main> | OSHealthChecker: | PASSED - Clock Source Speed Check Report[callsPerSec=26364066] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=332380, randomLong=-6370784305512029925, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=8850, randomLong=9048434270270047458, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=1318691, data=35, exception=null] OS Health Check Report - Complete (took 1025 ms) | |
| node1 | 4.805s | 2025-11-17 21:06:30.792 | 32 | DEBUG | STARTUP | <main> | BootstrapUtils: | jvmPauseDetectorThread started | |
| node0 | 4.810s | 2025-11-17 21:06:30.797 | 41 | INFO | EVENT_STREAM | <main> | DefaultConsensusEventStream: | EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b | |
| node1 | 4.812s | 2025-11-17 21:06:30.799 | 33 | INFO | STARTUP | <main> | StandardScratchpad: | Scratchpad platform.iss contents: LAST_ISS_ROUND null | |
| node0 | 4.814s | 2025-11-17 21:06:30.801 | 42 | INFO | STARTUP | <platformForkJoinThread-2> | Shadowgraph: | Shadowgraph starting from expiration threshold 1 | |
| node1 | 4.814s | 2025-11-17 21:06:30.801 | 34 | INFO | STARTUP | <main> | PlatformBuilder: | Default platform pool parallelism: 8 | |
| node0 | 4.818s | 2025-11-17 21:06:30.805 | 43 | INFO | STARTUP | <<start-node-0>> | ConsistencyTestingToolMain: | init called in Main for node 0. | |
| node0 | 4.819s | 2025-11-17 21:06:30.806 | 44 | INFO | STARTUP | <<start-node-0>> | SwirldsPlatform: | Starting platform 0 | |
| node0 | 4.820s | 2025-11-17 21:06:30.807 | 45 | INFO | STARTUP | <<platform: recycle-bin-cleanup>> | RecycleBinImpl: | Deleted 0 files from the recycle bin. | |
| node0 | 4.823s | 2025-11-17 21:06:30.810 | 46 | INFO | STARTUP | <<start-node-0>> | CycleFinder: | No cyclical back pressure detected in wiring model. | |
| node0 | 4.824s | 2025-11-17 21:06:30.811 | 47 | INFO | STARTUP | <<start-node-0>> | DirectSchedulerChecks: | No illegal direct scheduler use detected in the wiring model. | |
| node0 | 4.825s | 2025-11-17 21:06:30.812 | 48 | INFO | STARTUP | <<start-node-0>> | InputWireChecks: | All input wires have been bound. | |
| node0 | 4.826s | 2025-11-17 21:06:30.813 | 49 | WARN | STARTUP | <<start-node-0>> | PcesFileTracker: | No preconsensus event files available | |
| node0 | 4.827s | 2025-11-17 21:06:30.814 | 50 | INFO | STARTUP | <<start-node-0>> | SwirldsPlatform: | replaying preconsensus event stream starting at 0 | |
| node0 | 4.829s | 2025-11-17 21:06:30.816 | 51 | INFO | STARTUP | <<start-node-0>> | PcesReplayer: | Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds. | |
| node0 | 4.830s | 2025-11-17 21:06:30.817 | 52 | INFO | STARTUP | <<app: appMain 0>> | ConsistencyTestingToolMain: | run called in Main. | |
| node0 | 4.832s | 2025-11-17 21:06:30.819 | 53 | INFO | PLATFORM_STATUS | <platformForkJoinThread-4> | StatusStateMachine: | Platform spent 203.0 ms in STARTING_UP. Now in REPLAYING_EVENTS | |
| node0 | 4.837s | 2025-11-17 21:06:30.824 | 54 | INFO | PLATFORM_STATUS | <platformForkJoinThread-4> | StatusStateMachine: | Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING | |
| node1 | 4.902s | 2025-11-17 21:06:30.889 | 35 | INFO | STARTUP | <main> | SwirldsPlatform: | Starting with roster history: | |
| RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIdUmpLKzyXgUwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBALXCoDQ+HOVsEDTZpFuJITSaGwaKX2is5K1P/lV+G+ll6u36IdqKNnZIirJrpX2N0Ad6NeF/oFcMhietrKt818PDA9Tbb2tqcHNKTxxZAEj7amQTsrU4EsNmUhaPgMs89yj9WLxCXVzW05cQjqYEA/hymzohWs1BdU3Y2KdmELe0v5fzRgDpNgYHhUN7IrlrlgXEWpuKRskBYc4PIvyACijY0/zkeEAyHOshYYGKhQbNm/NGWhFq83ro77CZZhX3Vl7hRnHLaEoCEE8atY8R1Txhy8aObhiS6R8ZVRTkZLar/FG/xe78RQfwHHD1al2w5oHR7xgTZylhbD+nVQ09Zmi25USpvqwumbMBE0OWhV+VH1WLCHfLQs6/5yuDjeZ/0D9tpQ8pfkiEkGLedzUzQkq+4/HmN4IFTOhgJHlu1tVUqohZIPZ5zSzqkqFzFQGRo2uAX8C2EJ3qgQMAEOpH8iOjiSKsezlIPuwvmrVDPxVfpY2Cq60oxRu6B8bZdbQkfwIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQAloxwiVu7pBhkO4fLqYRw4FC0VEx+c47W4xnrq3G/uXMGwE2Mfwple9FZnfT9JgSoT1UVw+cigo4720WdrPqkK8qnA3/PzGXlfJ3k6eFcBuli/KY1TakIJUAxFt5biNKatheMwAKsbF/JyVyaqG2dbSaXQ6hZBLQTYmLrmFWMvi9QdM1S8vNVMjn0hE2qQJtnVRuVwqRaAQ225jDv2CUCT28t0EWE6ccbiRi74l8KoW1Lo3v2EQ6ZZ89Xt3CwFSQHa6YVT685ECy82qMysU+YHBe9WmwJW05UAAY7JRsOo+RuuU/r4acNLmzprG+l7qsqqPkwXTcziw9Y2OYsFgY4bTlIOV0JC0AYApctDB3gbn83LM73CWccGrXq0liSV0wL11wscH3gFohXrwb646+6hgncZiDshlZlWaFSkHQJAxTR9bsbsCwKdZpzIIVOVTOT/3oLQKCCQvPriTpJiNa0P6gB0pq64lNcyG9fL8vS3YFFnWJTZwb8ZzGK+LZ91/2Y=", "gossipEndpoint": [{ "ipAddressV4": "IjgVaw==", "port": 30124 }, { "ipAddressV4": "CoAAGA==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAJguXwyGFpb8MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTIwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTIwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDYXoYHBtw8adD5sxLZSnlG9XgBLWVbIDl3YA4rZZ11cgl6FG2TvF8UVNXQ177cRm1xUUJRI5ulSgDofnm7Iuf6c/GoQrud2nP1yMWewGslwiEi1h2pxbN7doFvn/92Y0lJVwSV/vOpbIyPRoMeF0jXd7TEI7dYj4S7gV9uWmQCIWjwTZqVsjIAtzEkYnmS0/m5XuD9MJsin8OQRu/PEFL8qaVPQJ2GhOhpUJqvADQ/Lsq/FHcPjylcRcnUQlFRojk2jqugtoRegByjPrAOSYGJeWUCVYmd7W51L/AkVx1rDLeHj0zLTTzQRF5G56i+S+tAcpY/uiCrwLvszFlDlD1diOuaucmu54lalrSTlVe5eOyq2ga2tKi11LQ+w09105zLyRWk7DBU93f5dTYNSmokI7b4sVRxu6SP0p/F9wND77wv2Ax5OpIWWty8zy8Y+xOuRyFu/rJ4ddDmRYvRmptM0rCAfv6hgd3m5Y/OAadQm/OuN91Uq9PIJdlMtjDbIfECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEANutmL3V1PlvlsZ6xG8Sx9cKTok3kf3rBf7D7eE8Nn8ryHi3cw9CvCaj1E6zmTTh9k23DAZVWulhjTY5GWcx5NO7QAWjKau44g/HecNNrWsD/+nIrhmAk2WxKp175CwqJaIWA7CM6VMfFktjaflUPcB6RJnHrAa8M1HUpEsBz0mFmLz7lIaDemxYCE8M8slb6wTMjpL83GB+ejudRe7YK2ZWixM+CGp0ARkV+EecHaCXgEoROUNwP6mZVJcgSVR1QBQwcGAMIrutsKENM8HR9o3LWacigoJXf+IX8c6aJhrHfFvm62q+hi3baj7iR6gebEdWPtmEXgoVWOk230fLGyPU1oBxaDdYa8V4+ZFv03O91By9tuFrwZOcLCb4CPRyr8A47lHNjRIeo2nUF/c+SjV0eBcPKCnn1nW/AQWCxJ0QzzG6tEeMAGdDrE2ujPlB+Y9Sn8vB0zjYQHTr1NKyyXNogB4y48jofLDLDGOQYI6uP2fDgZeiq4dV8w91WbPHV", "gossipEndpoint": [{ "ipAddressV4": "I8DbAg==", "port": 30125 }, { "ipAddressV4": "CoAAGQ==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJwswl59m488MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDAqlNMpfduuW0ETQVjdKf5ZBe3Ug/ybRMoCWIlue8UoxFzamAtoeFEW3GVi862iImRVyHbkBZzDQUw4ABwMdxfzTL9voozkMaOZb4KQ9yZ9zNLAAmSSuE6RFmSJnBtfufxFXqiu6esbcvyropjZLc65F2uoMCpKN0CHFpWEb2GZAaipp7WCOon0NllDLqkjPylluXO4mjbzzMSDPbBWRD8VjjkxZeszWSXYxz9hqcRYX01CGg+jhooCQ6j2yB8sfFAffIeTG6GSV1uCFa4san2emhQWpr+cHaVYJMtejL43HaEVQnF3vh5Z10T/7co63C63aay2hs6Bx5SschosyYiafI7GtbQ4qpOgjEDFT1jlydK21gy6MV3SFEYwcUfxvxxRj6pS7xiMFn4FYnBKPJWkaDkwTqboEshxstvASQOW993uEwzh4EjctRHSjSuTU6S9OsWi5I5cRF+xK6GaWsTp0KyO8uVpuM9kZfpOcor294quyKJ9nylNyIt/m8Q8/ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAqPLB/xr0Yv1l9w/RO+bqFtl8TkxF/6jOqoEUXY06dEInopLYpmkksZZ9G8vebt6hAoLjaxNMdRqCkzKgy4jn7/SQZNV9FMbZ7ckiDxsBxYZ2ZaBootuWzzVD6hCSO3Tg6JgkIzldtFtNcDVBRgZnHg+Rl6hn+gFV5S2OTTTPHWK7GHwgHXLhK7N0RL4YVrRCi/HTUZnuYCjBwvdDte5iqytY05cAO4p72P6YtDaOdAfL/IIKd1ylCWITDqTp/JDBz1uxjQmsXLVD/KEEtlvYlGjIr+wUUqIUPhFvB6ajl2NO0D/r+t1BH454zbodU92QnOJpXpoNuOv7jjALHCqo70mCSwTNUSZuVP6/KLmQe8sSzYs7O/c25FzHKBYy+aZujoa/X7aI6XVmsUkj6ae9MSvQurk0jMNg/Jy5EtWOMy7WEuyadrAv6KSP3oIfmL9jWoPcyOMfvjRHxGqOfZuFZatAwswY6O0E3ATTrN03t/BVqNHIYIXc6UOiUTo2Nx56", "gossipEndpoint": [{ "ipAddressV4": "IjlJoA==", "port": 30126 }, { "ipAddressV4": "CoAAHA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAOxH0o7YkAUoMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDf6+SJl+puqRNd5r2Tb802jQTqPm7k3NXIeU8NQ3Hy9p0G+9p4Hgnt3ftipar7lKPKnp4PFrOP7E7XSKpafxK2OVQ0jTMvc6Yjqt+9mzyNSI1I8cSHTmhJ7kMBt0+NwVM8QN+fbKcbQaoNiPwMcckVtGeMad4aZM6hRyxzI0H3wgMj4JiM9VRwx7JbEo3R7akRwLwGr9ZQm2EQwqiyReNkBnXrsyP4KPPVAoeMfGchoAuBbV+r6v1OeYddocYmZkrsvMXUKF/uEcgd8gTu+pv3jObwIEVqXo1yC6ZlCFqO7LIvT8jTAAljkszoo67ykXTbKS0PZeLDg6nvdPvBMQ50yjfswR88S6N8VU6pud7Y+VbMYUiGzlrFi4MB9dikAjEj4PEetQyZdn84ZXGxerXlU/vTO2Fp4i1ec5rmX1P0WYMlbNELE408j5nfCfzD/qdcF5HZAiUVTYU/SWpzWcn34++KGpuqZZQdsGwCLQWeMeA/OEemYChis4cO94aOzrECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAlj5YIsbYXk2JGP9kRCBLDgz27ymYi1KDbO8g18V4T0zj2Zl7858U7mF9UBSSW+Cjl1UtUdvqFWZhh8jRoO3Jov1QGTULHRfyyPElD4VpwFribiu4GYJaodYy6NE50WwSJf32gLG0jHQWt7q+cOrn6WaG2h8O1sIxbTlnu1kqKQUQtu4oX8u23b5m9QXVJfJVdecwD5Rmab2d3dq/NNv2iNELH0myqtcoqw26xwIvXwaS4Gqi+Y0cOfjWL5Gv5AHIwvBXGIh3KUU7pbyBzqjkigbzSeoZw0C8G2cRTl0+QTuet2SVYlFh5J9/FBLvIfMfIpguglaU6xTVoRpo7RF24qQKFt2IlBROpqcwl0FyfE+2c19FGt1V8E5dYqE4T2mHT6FSOI3DckA2afBm1OCeMNtkqCQT8x+JvdKrgUh44QDm4PIVZDzaxog/zOzRWPCgpCPq0HcNMzgCVFt+4q8eTL9Ju/rQcS9bDosjMA69NGLIOCdPW2i/gkS9x9rTXgyp", "gossipEndpoint": [{ "ipAddressV4": "aJuKEQ==", "port": 30127 }, { "ipAddressV4": "CoAAGg==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIXlngkVEv6iMwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAL4o3FK8th1cG+FSlw4iT9FlkwK+hOj4Ay6Z70mZlsNwszgxvddUEO4BEdA1iSWfxkYOLl4QwwPr3l394a07VfB5OK3dqJ6CjVdByyvzghtk3gOpkskWlJxp6vah7BbIJFWE8off7fhCdwAGSrwIRdGE8u8GbKJIdHk6/XyjB3j0BXTIgeaPTJxLeuz/2l/dQVRMXyZNxlc5UVQYnX9haMRk7M5bkb9uwfYPRikEJFp6G72x7M7Q9lBGJ3ArCQn/lPJfHSg01GxfDhWH8DOwLaFdv1bCs2zHTn7R7Wq9ymXvkUsZhlYO4mLR8HKDcM3sCrJa2rg8vgnIoZupHABKxkgtT2wxV7fM5f2oiz0mDYDTRJpgmK1lmNANj2tKnGqeDnsW7Q3zwufgZZhbks8+8uigyOyKNbp6D7Vv5KeYRibjr/xh+yWT0v02dtpBIdhqDa5CUVD9fCwigZj3PQc8N4e47ZL6s1pXpQ6Cf0lB0fSsvyhnGRa8HMx2q5eg5j/lCQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQCr9yUzOoi0xhoDE1mqR3FR/iVCq9PaBUURWL743LDMrlEvpzKX0upcwwwdgJFjVqVUywh6rKeHQt4O4UV6FIbpp0PSjSE7XZSK3UNqnhZJhQ3aNrOP+6wBhm2B0ZjrxyMS1EWeD9tcNkdYluO00RlieAEV4zwoAfeFPSB21iXW5dU8idhNuTLptDc7SJoErxN+44jvcrSe/ZhpQohG6WfyDPH0BE1tyzsiD29PAWKkrfhg5kzjTAP/qFp+ByazeltP9/F0NXI5AHbE0pKYr56XUlwDfDZOTU9b1YeS7kKyPvccvC2j9NjGGM7NjafdFLHUTYBZiNUTZXVstddYtTCVbTqI7I/x6hoeeNVDZv7XluwZLrYsDNsNrWU3c9VijPK1CE5Owy+gJoGgxEHfA/n9Jvc3lEesqKBpW92RazkpHW2eD9wh8Ayv3q6PNDGzWyiXA8YWW6yD/dIp2Oh8szZUfOXy8sQ8VW86T6RsqGP5CKKPGW1NnP/KTKe5/WoBLZQ=", "gossipEndpoint": [{ "ipAddressV4": "IhuUGg==", "port": 30128 }, { "ipAddressV4": "CoAAGw==", "port": 30128 }] }] } | |||||||||
| node1 | 4.926s | 2025-11-17 21:06:30.913 | 36 | INFO | STARTUP | <main> | TransactionHandlingHistory: | Consistency testing tool log path: data/saved/consistency-test/1/ConsistencyTestLog.csv | |
| node1 | 4.926s | 2025-11-17 21:06:30.913 | 37 | INFO | STARTUP | <main> | TransactionHandlingHistory: | No log file found. Starting without any previous history | |
| node2 | 4.935s | 2025-11-17 21:06:30.922 | 6 | DEBUG | STARTUP | <main> | BootstrapUtils: | Done with registerConstructables, time taken 1479ms | |
| node1 | 4.939s | 2025-11-17 21:06:30.926 | 38 | INFO | STARTUP | <main> | StateInitializer: | The platform is using the following initial state: Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: d1414917bf3d1ee8f982919e8ccc864e4703dfa42b094f2ed7c395ba1463b55b5894c4eb9d7e3860dd5e55677daf79af (root) VirtualMap state / subject-carry-blur-tackle | |
| node1 | 4.942s | 2025-11-17 21:06:30.929 | 40 | INFO | RECONNECT | <<platform-core: reconnectController>> | ReconnectController: | Starting the ReconnectController | |
| node2 | 4.943s | 2025-11-17 21:06:30.930 | 7 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | constructor called in Main. | |
| node2 | 4.946s | 2025-11-17 21:06:30.933 | 8 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node2 | 4.984s | 2025-11-17 21:06:30.971 | 9 | INFO | STARTUP | <main> | PrometheusEndpoint: | PrometheusEndpoint: Starting server listing on port: 9999 | |
| node2 | 5.062s | 2025-11-17 21:06:31.049 | 10 | WARN | STARTUP | <main> | CryptoStatic: | There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB. | |
| node2 | 5.064s | 2025-11-17 21:06:31.051 | 11 | DEBUG | STARTUP | <main> | CryptoStatic: | Started generating keys | |
| node1 | 5.181s | 2025-11-17 21:06:31.168 | 41 | INFO | EVENT_STREAM | <main> | DefaultConsensusEventStream: | EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b | |
| node1 | 5.185s | 2025-11-17 21:06:31.172 | 42 | INFO | STARTUP | <platformForkJoinThread-2> | Shadowgraph: | Shadowgraph starting from expiration threshold 1 | |
| node1 | 5.189s | 2025-11-17 21:06:31.176 | 43 | INFO | STARTUP | <<start-node-1>> | ConsistencyTestingToolMain: | init called in Main for node 1. | |
| node1 | 5.190s | 2025-11-17 21:06:31.177 | 44 | INFO | STARTUP | <<start-node-1>> | SwirldsPlatform: | Starting platform 1 | |
| node1 | 5.191s | 2025-11-17 21:06:31.178 | 45 | INFO | STARTUP | <<platform: recycle-bin-cleanup>> | RecycleBinImpl: | Deleted 0 files from the recycle bin. | |
| node1 | 5.195s | 2025-11-17 21:06:31.182 | 46 | INFO | STARTUP | <<start-node-1>> | CycleFinder: | No cyclical back pressure detected in wiring model. | |
| node1 | 5.196s | 2025-11-17 21:06:31.183 | 47 | INFO | STARTUP | <<start-node-1>> | DirectSchedulerChecks: | No illegal direct scheduler use detected in the wiring model. | |
| node1 | 5.196s | 2025-11-17 21:06:31.183 | 48 | INFO | STARTUP | <<start-node-1>> | InputWireChecks: | All input wires have been bound. | |
| node1 | 5.198s | 2025-11-17 21:06:31.185 | 49 | WARN | STARTUP | <<start-node-1>> | PcesFileTracker: | No preconsensus event files available | |
| node1 | 5.198s | 2025-11-17 21:06:31.185 | 50 | INFO | STARTUP | <<start-node-1>> | SwirldsPlatform: | replaying preconsensus event stream starting at 0 | |
| node1 | 5.200s | 2025-11-17 21:06:31.187 | 51 | INFO | STARTUP | <<start-node-1>> | PcesReplayer: | Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds. | |
| node1 | 5.201s | 2025-11-17 21:06:31.188 | 52 | INFO | STARTUP | <<app: appMain 1>> | ConsistencyTestingToolMain: | run called in Main. | |
| node1 | 5.203s | 2025-11-17 21:06:31.190 | 53 | INFO | PLATFORM_STATUS | <platformForkJoinThread-1> | StatusStateMachine: | Platform spent 203.0 ms in STARTING_UP. Now in REPLAYING_EVENTS | |
| node1 | 5.209s | 2025-11-17 21:06:31.196 | 54 | INFO | PLATFORM_STATUS | <platformForkJoinThread-1> | StatusStateMachine: | Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING | |
| node4 | 5.515s | 2025-11-17 21:06:31.502 | 24 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node4 | 5.518s | 2025-11-17 21:06:31.505 | 27 | INFO | STARTUP | <main> | BootstrapUtils: | Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]. | |
| node4 | 5.525s | 2025-11-17 21:06:31.512 | 28 | INFO | STARTUP | <main> | AddressBookInitializer: | Starting from genesis: using the config address book. | |
| node4 | 5.537s | 2025-11-17 21:06:31.524 | 29 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node4 | 5.540s | 2025-11-17 21:06:31.527 | 30 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node3 | 5.703s | 2025-11-17 21:06:31.690 | 31 | INFO | STARTUP | <main> | OSHealthChecker: | PASSED - Clock Source Speed Check Report[callsPerSec=26109711] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=198540, randomLong=-8618848767515482521, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=14570, randomLong=879991410491646821, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=1406480, data=35, exception=null] OS Health Check Report - Complete (took 1030 ms) | |
| node3 | 5.735s | 2025-11-17 21:06:31.722 | 32 | DEBUG | STARTUP | <main> | BootstrapUtils: | jvmPauseDetectorThread started | |
| node3 | 5.744s | 2025-11-17 21:06:31.731 | 33 | INFO | STARTUP | <main> | StandardScratchpad: | Scratchpad platform.iss contents: LAST_ISS_ROUND null | |
| node3 | 5.746s | 2025-11-17 21:06:31.733 | 34 | INFO | STARTUP | <main> | PlatformBuilder: | Default platform pool parallelism: 8 | |
| node3 | 5.832s | 2025-11-17 21:06:31.819 | 35 | INFO | STARTUP | <main> | SwirldsPlatform: | Starting with roster history: | |
| RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIdUmpLKzyXgUwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBALXCoDQ+HOVsEDTZpFuJITSaGwaKX2is5K1P/lV+G+ll6u36IdqKNnZIirJrpX2N0Ad6NeF/oFcMhietrKt818PDA9Tbb2tqcHNKTxxZAEj7amQTsrU4EsNmUhaPgMs89yj9WLxCXVzW05cQjqYEA/hymzohWs1BdU3Y2KdmELe0v5fzRgDpNgYHhUN7IrlrlgXEWpuKRskBYc4PIvyACijY0/zkeEAyHOshYYGKhQbNm/NGWhFq83ro77CZZhX3Vl7hRnHLaEoCEE8atY8R1Txhy8aObhiS6R8ZVRTkZLar/FG/xe78RQfwHHD1al2w5oHR7xgTZylhbD+nVQ09Zmi25USpvqwumbMBE0OWhV+VH1WLCHfLQs6/5yuDjeZ/0D9tpQ8pfkiEkGLedzUzQkq+4/HmN4IFTOhgJHlu1tVUqohZIPZ5zSzqkqFzFQGRo2uAX8C2EJ3qgQMAEOpH8iOjiSKsezlIPuwvmrVDPxVfpY2Cq60oxRu6B8bZdbQkfwIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQAloxwiVu7pBhkO4fLqYRw4FC0VEx+c47W4xnrq3G/uXMGwE2Mfwple9FZnfT9JgSoT1UVw+cigo4720WdrPqkK8qnA3/PzGXlfJ3k6eFcBuli/KY1TakIJUAxFt5biNKatheMwAKsbF/JyVyaqG2dbSaXQ6hZBLQTYmLrmFWMvi9QdM1S8vNVMjn0hE2qQJtnVRuVwqRaAQ225jDv2CUCT28t0EWE6ccbiRi74l8KoW1Lo3v2EQ6ZZ89Xt3CwFSQHa6YVT685ECy82qMysU+YHBe9WmwJW05UAAY7JRsOo+RuuU/r4acNLmzprG+l7qsqqPkwXTcziw9Y2OYsFgY4bTlIOV0JC0AYApctDB3gbn83LM73CWccGrXq0liSV0wL11wscH3gFohXrwb646+6hgncZiDshlZlWaFSkHQJAxTR9bsbsCwKdZpzIIVOVTOT/3oLQKCCQvPriTpJiNa0P6gB0pq64lNcyG9fL8vS3YFFnWJTZwb8ZzGK+LZ91/2Y=", "gossipEndpoint": [{ "ipAddressV4": "IjgVaw==", "port": 30124 }, { "ipAddressV4": "CoAAGA==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAJguXwyGFpb8MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTIwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTIwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDYXoYHBtw8adD5sxLZSnlG9XgBLWVbIDl3YA4rZZ11cgl6FG2TvF8UVNXQ177cRm1xUUJRI5ulSgDofnm7Iuf6c/GoQrud2nP1yMWewGslwiEi1h2pxbN7doFvn/92Y0lJVwSV/vOpbIyPRoMeF0jXd7TEI7dYj4S7gV9uWmQCIWjwTZqVsjIAtzEkYnmS0/m5XuD9MJsin8OQRu/PEFL8qaVPQJ2GhOhpUJqvADQ/Lsq/FHcPjylcRcnUQlFRojk2jqugtoRegByjPrAOSYGJeWUCVYmd7W51L/AkVx1rDLeHj0zLTTzQRF5G56i+S+tAcpY/uiCrwLvszFlDlD1diOuaucmu54lalrSTlVe5eOyq2ga2tKi11LQ+w09105zLyRWk7DBU93f5dTYNSmokI7b4sVRxu6SP0p/F9wND77wv2Ax5OpIWWty8zy8Y+xOuRyFu/rJ4ddDmRYvRmptM0rCAfv6hgd3m5Y/OAadQm/OuN91Uq9PIJdlMtjDbIfECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEANutmL3V1PlvlsZ6xG8Sx9cKTok3kf3rBf7D7eE8Nn8ryHi3cw9CvCaj1E6zmTTh9k23DAZVWulhjTY5GWcx5NO7QAWjKau44g/HecNNrWsD/+nIrhmAk2WxKp175CwqJaIWA7CM6VMfFktjaflUPcB6RJnHrAa8M1HUpEsBz0mFmLz7lIaDemxYCE8M8slb6wTMjpL83GB+ejudRe7YK2ZWixM+CGp0ARkV+EecHaCXgEoROUNwP6mZVJcgSVR1QBQwcGAMIrutsKENM8HR9o3LWacigoJXf+IX8c6aJhrHfFvm62q+hi3baj7iR6gebEdWPtmEXgoVWOk230fLGyPU1oBxaDdYa8V4+ZFv03O91By9tuFrwZOcLCb4CPRyr8A47lHNjRIeo2nUF/c+SjV0eBcPKCnn1nW/AQWCxJ0QzzG6tEeMAGdDrE2ujPlB+Y9Sn8vB0zjYQHTr1NKyyXNogB4y48jofLDLDGOQYI6uP2fDgZeiq4dV8w91WbPHV", "gossipEndpoint": [{ "ipAddressV4": "I8DbAg==", "port": 30125 }, { "ipAddressV4": "CoAAGQ==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJwswl59m488MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDAqlNMpfduuW0ETQVjdKf5ZBe3Ug/ybRMoCWIlue8UoxFzamAtoeFEW3GVi862iImRVyHbkBZzDQUw4ABwMdxfzTL9voozkMaOZb4KQ9yZ9zNLAAmSSuE6RFmSJnBtfufxFXqiu6esbcvyropjZLc65F2uoMCpKN0CHFpWEb2GZAaipp7WCOon0NllDLqkjPylluXO4mjbzzMSDPbBWRD8VjjkxZeszWSXYxz9hqcRYX01CGg+jhooCQ6j2yB8sfFAffIeTG6GSV1uCFa4san2emhQWpr+cHaVYJMtejL43HaEVQnF3vh5Z10T/7co63C63aay2hs6Bx5SschosyYiafI7GtbQ4qpOgjEDFT1jlydK21gy6MV3SFEYwcUfxvxxRj6pS7xiMFn4FYnBKPJWkaDkwTqboEshxstvASQOW993uEwzh4EjctRHSjSuTU6S9OsWi5I5cRF+xK6GaWsTp0KyO8uVpuM9kZfpOcor294quyKJ9nylNyIt/m8Q8/ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAqPLB/xr0Yv1l9w/RO+bqFtl8TkxF/6jOqoEUXY06dEInopLYpmkksZZ9G8vebt6hAoLjaxNMdRqCkzKgy4jn7/SQZNV9FMbZ7ckiDxsBxYZ2ZaBootuWzzVD6hCSO3Tg6JgkIzldtFtNcDVBRgZnHg+Rl6hn+gFV5S2OTTTPHWK7GHwgHXLhK7N0RL4YVrRCi/HTUZnuYCjBwvdDte5iqytY05cAO4p72P6YtDaOdAfL/IIKd1ylCWITDqTp/JDBz1uxjQmsXLVD/KEEtlvYlGjIr+wUUqIUPhFvB6ajl2NO0D/r+t1BH454zbodU92QnOJpXpoNuOv7jjALHCqo70mCSwTNUSZuVP6/KLmQe8sSzYs7O/c25FzHKBYy+aZujoa/X7aI6XVmsUkj6ae9MSvQurk0jMNg/Jy5EtWOMy7WEuyadrAv6KSP3oIfmL9jWoPcyOMfvjRHxGqOfZuFZatAwswY6O0E3ATTrN03t/BVqNHIYIXc6UOiUTo2Nx56", "gossipEndpoint": [{ "ipAddressV4": "IjlJoA==", "port": 30126 }, { "ipAddressV4": "CoAAHA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAOxH0o7YkAUoMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDf6+SJl+puqRNd5r2Tb802jQTqPm7k3NXIeU8NQ3Hy9p0G+9p4Hgnt3ftipar7lKPKnp4PFrOP7E7XSKpafxK2OVQ0jTMvc6Yjqt+9mzyNSI1I8cSHTmhJ7kMBt0+NwVM8QN+fbKcbQaoNiPwMcckVtGeMad4aZM6hRyxzI0H3wgMj4JiM9VRwx7JbEo3R7akRwLwGr9ZQm2EQwqiyReNkBnXrsyP4KPPVAoeMfGchoAuBbV+r6v1OeYddocYmZkrsvMXUKF/uEcgd8gTu+pv3jObwIEVqXo1yC6ZlCFqO7LIvT8jTAAljkszoo67ykXTbKS0PZeLDg6nvdPvBMQ50yjfswR88S6N8VU6pud7Y+VbMYUiGzlrFi4MB9dikAjEj4PEetQyZdn84ZXGxerXlU/vTO2Fp4i1ec5rmX1P0WYMlbNELE408j5nfCfzD/qdcF5HZAiUVTYU/SWpzWcn34++KGpuqZZQdsGwCLQWeMeA/OEemYChis4cO94aOzrECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAlj5YIsbYXk2JGP9kRCBLDgz27ymYi1KDbO8g18V4T0zj2Zl7858U7mF9UBSSW+Cjl1UtUdvqFWZhh8jRoO3Jov1QGTULHRfyyPElD4VpwFribiu4GYJaodYy6NE50WwSJf32gLG0jHQWt7q+cOrn6WaG2h8O1sIxbTlnu1kqKQUQtu4oX8u23b5m9QXVJfJVdecwD5Rmab2d3dq/NNv2iNELH0myqtcoqw26xwIvXwaS4Gqi+Y0cOfjWL5Gv5AHIwvBXGIh3KUU7pbyBzqjkigbzSeoZw0C8G2cRTl0+QTuet2SVYlFh5J9/FBLvIfMfIpguglaU6xTVoRpo7RF24qQKFt2IlBROpqcwl0FyfE+2c19FGt1V8E5dYqE4T2mHT6FSOI3DckA2afBm1OCeMNtkqCQT8x+JvdKrgUh44QDm4PIVZDzaxog/zOzRWPCgpCPq0HcNMzgCVFt+4q8eTL9Ju/rQcS9bDosjMA69NGLIOCdPW2i/gkS9x9rTXgyp", "gossipEndpoint": [{ "ipAddressV4": "aJuKEQ==", "port": 30127 }, { "ipAddressV4": "CoAAGg==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIXlngkVEv6iMwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAL4o3FK8th1cG+FSlw4iT9FlkwK+hOj4Ay6Z70mZlsNwszgxvddUEO4BEdA1iSWfxkYOLl4QwwPr3l394a07VfB5OK3dqJ6CjVdByyvzghtk3gOpkskWlJxp6vah7BbIJFWE8off7fhCdwAGSrwIRdGE8u8GbKJIdHk6/XyjB3j0BXTIgeaPTJxLeuz/2l/dQVRMXyZNxlc5UVQYnX9haMRk7M5bkb9uwfYPRikEJFp6G72x7M7Q9lBGJ3ArCQn/lPJfHSg01GxfDhWH8DOwLaFdv1bCs2zHTn7R7Wq9ymXvkUsZhlYO4mLR8HKDcM3sCrJa2rg8vgnIoZupHABKxkgtT2wxV7fM5f2oiz0mDYDTRJpgmK1lmNANj2tKnGqeDnsW7Q3zwufgZZhbks8+8uigyOyKNbp6D7Vv5KeYRibjr/xh+yWT0v02dtpBIdhqDa5CUVD9fCwigZj3PQc8N4e47ZL6s1pXpQ6Cf0lB0fSsvyhnGRa8HMx2q5eg5j/lCQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQCr9yUzOoi0xhoDE1mqR3FR/iVCq9PaBUURWL743LDMrlEvpzKX0upcwwwdgJFjVqVUywh6rKeHQt4O4UV6FIbpp0PSjSE7XZSK3UNqnhZJhQ3aNrOP+6wBhm2B0ZjrxyMS1EWeD9tcNkdYluO00RlieAEV4zwoAfeFPSB21iXW5dU8idhNuTLptDc7SJoErxN+44jvcrSe/ZhpQohG6WfyDPH0BE1tyzsiD29PAWKkrfhg5kzjTAP/qFp+ByazeltP9/F0NXI5AHbE0pKYr56XUlwDfDZOTU9b1YeS7kKyPvccvC2j9NjGGM7NjafdFLHUTYBZiNUTZXVstddYtTCVbTqI7I/x6hoeeNVDZv7XluwZLrYsDNsNrWU3c9VijPK1CE5Owy+gJoGgxEHfA/n9Jvc3lEesqKBpW92RazkpHW2eD9wh8Ayv3q6PNDGzWyiXA8YWW6yD/dIp2Oh8szZUfOXy8sQ8VW86T6RsqGP5CKKPGW1NnP/KTKe5/WoBLZQ=", "gossipEndpoint": [{ "ipAddressV4": "IhuUGg==", "port": 30128 }, { "ipAddressV4": "CoAAGw==", "port": 30128 }] }] } | |||||||||
| node3 | 5.858s | 2025-11-17 21:06:31.845 | 36 | INFO | STARTUP | <main> | TransactionHandlingHistory: | Consistency testing tool log path: data/saved/consistency-test/3/ConsistencyTestLog.csv | |
| node3 | 5.859s | 2025-11-17 21:06:31.846 | 37 | INFO | STARTUP | <main> | TransactionHandlingHistory: | No log file found. Starting without any previous history | |
| node3 | 5.874s | 2025-11-17 21:06:31.861 | 38 | INFO | STARTUP | <main> | StateInitializer: | The platform is using the following initial state: Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: d1414917bf3d1ee8f982919e8ccc864e4703dfa42b094f2ed7c395ba1463b55b5894c4eb9d7e3860dd5e55677daf79af (root) VirtualMap state / subject-carry-blur-tackle | |
| node3 | 5.877s | 2025-11-17 21:06:31.864 | 40 | INFO | RECONNECT | <<platform-core: reconnectController>> | ReconnectController: | Starting the ReconnectController | |
| node2 | 5.901s | 2025-11-17 21:06:31.888 | 12 | DEBUG | STARTUP | <main> | CryptoStatic: | Done generating keys | |
| node2 | 5.995s | 2025-11-17 21:06:31.982 | 15 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node2 | 5.998s | 2025-11-17 21:06:31.985 | 16 | INFO | STARTUP | <main> | StartupStateUtils: | No saved states were found on disk. | |
| node2 | 6.035s | 2025-11-17 21:06:32.022 | 21 | INFO | STARTUP | <main> | ConsistencyTestingToolState: | New State Constructed. | |
| node3 | 6.107s | 2025-11-17 21:06:32.094 | 41 | INFO | EVENT_STREAM | <main> | DefaultConsensusEventStream: | EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b | |
| node3 | 6.112s | 2025-11-17 21:06:32.099 | 42 | INFO | STARTUP | <platformForkJoinThread-2> | Shadowgraph: | Shadowgraph starting from expiration threshold 1 | |
| node3 | 6.117s | 2025-11-17 21:06:32.104 | 43 | INFO | STARTUP | <<start-node-3>> | ConsistencyTestingToolMain: | init called in Main for node 3. | |
| node3 | 6.117s | 2025-11-17 21:06:32.104 | 44 | INFO | STARTUP | <<start-node-3>> | SwirldsPlatform: | Starting platform 3 | |
| node3 | 6.119s | 2025-11-17 21:06:32.106 | 45 | INFO | STARTUP | <<platform: recycle-bin-cleanup>> | RecycleBinImpl: | Deleted 0 files from the recycle bin. | |
| node3 | 6.122s | 2025-11-17 21:06:32.109 | 46 | INFO | STARTUP | <<start-node-3>> | CycleFinder: | No cyclical back pressure detected in wiring model. | |
| node3 | 6.123s | 2025-11-17 21:06:32.110 | 47 | INFO | STARTUP | <<start-node-3>> | DirectSchedulerChecks: | No illegal direct scheduler use detected in the wiring model. | |
| node3 | 6.124s | 2025-11-17 21:06:32.111 | 48 | INFO | STARTUP | <<start-node-3>> | InputWireChecks: | All input wires have been bound. | |
| node3 | 6.125s | 2025-11-17 21:06:32.112 | 49 | WARN | STARTUP | <<start-node-3>> | PcesFileTracker: | No preconsensus event files available | |
| node3 | 6.126s | 2025-11-17 21:06:32.113 | 50 | INFO | STARTUP | <<start-node-3>> | SwirldsPlatform: | replaying preconsensus event stream starting at 0 | |
| node3 | 6.127s | 2025-11-17 21:06:32.114 | 51 | INFO | STARTUP | <<start-node-3>> | PcesReplayer: | Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds. | |
| node3 | 6.128s | 2025-11-17 21:06:32.115 | 52 | INFO | STARTUP | <<app: appMain 3>> | ConsistencyTestingToolMain: | run called in Main. | |
| node3 | 6.131s | 2025-11-17 21:06:32.118 | 53 | INFO | PLATFORM_STATUS | <platformForkJoinThread-5> | StatusStateMachine: | Platform spent 194.0 ms in STARTING_UP. Now in REPLAYING_EVENTS | |
| node3 | 6.135s | 2025-11-17 21:06:32.122 | 54 | INFO | PLATFORM_STATUS | <platformForkJoinThread-5> | StatusStateMachine: | Platform spent 3.0 ms in REPLAYING_EVENTS. Now in OBSERVING | |
| node4 | 6.666s | 2025-11-17 21:06:32.653 | 31 | INFO | STARTUP | <main> | OSHealthChecker: | PASSED - Clock Source Speed Check Report[callsPerSec=26140941] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=271700, randomLong=-961550723549281759, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=11730, randomLong=-2204984410370273533, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=1576629, data=35, exception=null] OS Health Check Report - Complete (took 1024 ms) | |
| node4 | 6.698s | 2025-11-17 21:06:32.685 | 32 | DEBUG | STARTUP | <main> | BootstrapUtils: | jvmPauseDetectorThread started | |
| node4 | 6.707s | 2025-11-17 21:06:32.694 | 33 | INFO | STARTUP | <main> | StandardScratchpad: | Scratchpad platform.iss contents: LAST_ISS_ROUND null | |
| node4 | 6.709s | 2025-11-17 21:06:32.696 | 34 | INFO | STARTUP | <main> | PlatformBuilder: | Default platform pool parallelism: 8 | |
| node4 | 6.799s | 2025-11-17 21:06:32.786 | 35 | INFO | STARTUP | <main> | SwirldsPlatform: | Starting with roster history: | |
| RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIdUmpLKzyXgUwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBALXCoDQ+HOVsEDTZpFuJITSaGwaKX2is5K1P/lV+G+ll6u36IdqKNnZIirJrpX2N0Ad6NeF/oFcMhietrKt818PDA9Tbb2tqcHNKTxxZAEj7amQTsrU4EsNmUhaPgMs89yj9WLxCXVzW05cQjqYEA/hymzohWs1BdU3Y2KdmELe0v5fzRgDpNgYHhUN7IrlrlgXEWpuKRskBYc4PIvyACijY0/zkeEAyHOshYYGKhQbNm/NGWhFq83ro77CZZhX3Vl7hRnHLaEoCEE8atY8R1Txhy8aObhiS6R8ZVRTkZLar/FG/xe78RQfwHHD1al2w5oHR7xgTZylhbD+nVQ09Zmi25USpvqwumbMBE0OWhV+VH1WLCHfLQs6/5yuDjeZ/0D9tpQ8pfkiEkGLedzUzQkq+4/HmN4IFTOhgJHlu1tVUqohZIPZ5zSzqkqFzFQGRo2uAX8C2EJ3qgQMAEOpH8iOjiSKsezlIPuwvmrVDPxVfpY2Cq60oxRu6B8bZdbQkfwIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQAloxwiVu7pBhkO4fLqYRw4FC0VEx+c47W4xnrq3G/uXMGwE2Mfwple9FZnfT9JgSoT1UVw+cigo4720WdrPqkK8qnA3/PzGXlfJ3k6eFcBuli/KY1TakIJUAxFt5biNKatheMwAKsbF/JyVyaqG2dbSaXQ6hZBLQTYmLrmFWMvi9QdM1S8vNVMjn0hE2qQJtnVRuVwqRaAQ225jDv2CUCT28t0EWE6ccbiRi74l8KoW1Lo3v2EQ6ZZ89Xt3CwFSQHa6YVT685ECy82qMysU+YHBe9WmwJW05UAAY7JRsOo+RuuU/r4acNLmzprG+l7qsqqPkwXTcziw9Y2OYsFgY4bTlIOV0JC0AYApctDB3gbn83LM73CWccGrXq0liSV0wL11wscH3gFohXrwb646+6hgncZiDshlZlWaFSkHQJAxTR9bsbsCwKdZpzIIVOVTOT/3oLQKCCQvPriTpJiNa0P6gB0pq64lNcyG9fL8vS3YFFnWJTZwb8ZzGK+LZ91/2Y=", "gossipEndpoint": [{ "ipAddressV4": "IjgVaw==", "port": 30124 }, { "ipAddressV4": "CoAAGA==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAJguXwyGFpb8MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTIwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTIwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDYXoYHBtw8adD5sxLZSnlG9XgBLWVbIDl3YA4rZZ11cgl6FG2TvF8UVNXQ177cRm1xUUJRI5ulSgDofnm7Iuf6c/GoQrud2nP1yMWewGslwiEi1h2pxbN7doFvn/92Y0lJVwSV/vOpbIyPRoMeF0jXd7TEI7dYj4S7gV9uWmQCIWjwTZqVsjIAtzEkYnmS0/m5XuD9MJsin8OQRu/PEFL8qaVPQJ2GhOhpUJqvADQ/Lsq/FHcPjylcRcnUQlFRojk2jqugtoRegByjPrAOSYGJeWUCVYmd7W51L/AkVx1rDLeHj0zLTTzQRF5G56i+S+tAcpY/uiCrwLvszFlDlD1diOuaucmu54lalrSTlVe5eOyq2ga2tKi11LQ+w09105zLyRWk7DBU93f5dTYNSmokI7b4sVRxu6SP0p/F9wND77wv2Ax5OpIWWty8zy8Y+xOuRyFu/rJ4ddDmRYvRmptM0rCAfv6hgd3m5Y/OAadQm/OuN91Uq9PIJdlMtjDbIfECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEANutmL3V1PlvlsZ6xG8Sx9cKTok3kf3rBf7D7eE8Nn8ryHi3cw9CvCaj1E6zmTTh9k23DAZVWulhjTY5GWcx5NO7QAWjKau44g/HecNNrWsD/+nIrhmAk2WxKp175CwqJaIWA7CM6VMfFktjaflUPcB6RJnHrAa8M1HUpEsBz0mFmLz7lIaDemxYCE8M8slb6wTMjpL83GB+ejudRe7YK2ZWixM+CGp0ARkV+EecHaCXgEoROUNwP6mZVJcgSVR1QBQwcGAMIrutsKENM8HR9o3LWacigoJXf+IX8c6aJhrHfFvm62q+hi3baj7iR6gebEdWPtmEXgoVWOk230fLGyPU1oBxaDdYa8V4+ZFv03O91By9tuFrwZOcLCb4CPRyr8A47lHNjRIeo2nUF/c+SjV0eBcPKCnn1nW/AQWCxJ0QzzG6tEeMAGdDrE2ujPlB+Y9Sn8vB0zjYQHTr1NKyyXNogB4y48jofLDLDGOQYI6uP2fDgZeiq4dV8w91WbPHV", "gossipEndpoint": [{ "ipAddressV4": "I8DbAg==", "port": 30125 }, { "ipAddressV4": "CoAAGQ==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAJwswl59m488MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDAqlNMpfduuW0ETQVjdKf5ZBe3Ug/ybRMoCWIlue8UoxFzamAtoeFEW3GVi862iImRVyHbkBZzDQUw4ABwMdxfzTL9voozkMaOZb4KQ9yZ9zNLAAmSSuE6RFmSJnBtfufxFXqiu6esbcvyropjZLc65F2uoMCpKN0CHFpWEb2GZAaipp7WCOon0NllDLqkjPylluXO4mjbzzMSDPbBWRD8VjjkxZeszWSXYxz9hqcRYX01CGg+jhooCQ6j2yB8sfFAffIeTG6GSV1uCFa4san2emhQWpr+cHaVYJMtejL43HaEVQnF3vh5Z10T/7co63C63aay2hs6Bx5SschosyYiafI7GtbQ4qpOgjEDFT1jlydK21gy6MV3SFEYwcUfxvxxRj6pS7xiMFn4FYnBKPJWkaDkwTqboEshxstvASQOW993uEwzh4EjctRHSjSuTU6S9OsWi5I5cRF+xK6GaWsTp0KyO8uVpuM9kZfpOcor294quyKJ9nylNyIt/m8Q8/ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAqPLB/xr0Yv1l9w/RO+bqFtl8TkxF/6jOqoEUXY06dEInopLYpmkksZZ9G8vebt6hAoLjaxNMdRqCkzKgy4jn7/SQZNV9FMbZ7ckiDxsBxYZ2ZaBootuWzzVD6hCSO3Tg6JgkIzldtFtNcDVBRgZnHg+Rl6hn+gFV5S2OTTTPHWK7GHwgHXLhK7N0RL4YVrRCi/HTUZnuYCjBwvdDte5iqytY05cAO4p72P6YtDaOdAfL/IIKd1ylCWITDqTp/JDBz1uxjQmsXLVD/KEEtlvYlGjIr+wUUqIUPhFvB6ajl2NO0D/r+t1BH454zbodU92QnOJpXpoNuOv7jjALHCqo70mCSwTNUSZuVP6/KLmQe8sSzYs7O/c25FzHKBYy+aZujoa/X7aI6XVmsUkj6ae9MSvQurk0jMNg/Jy5EtWOMy7WEuyadrAv6KSP3oIfmL9jWoPcyOMfvjRHxGqOfZuFZatAwswY6O0E3ATTrN03t/BVqNHIYIXc6UOiUTo2Nx56", "gossipEndpoint": [{ "ipAddressV4": "IjlJoA==", "port": 30126 }, { "ipAddressV4": "CoAAHA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAOxH0o7YkAUoMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDf6+SJl+puqRNd5r2Tb802jQTqPm7k3NXIeU8NQ3Hy9p0G+9p4Hgnt3ftipar7lKPKnp4PFrOP7E7XSKpafxK2OVQ0jTMvc6Yjqt+9mzyNSI1I8cSHTmhJ7kMBt0+NwVM8QN+fbKcbQaoNiPwMcckVtGeMad4aZM6hRyxzI0H3wgMj4JiM9VRwx7JbEo3R7akRwLwGr9ZQm2EQwqiyReNkBnXrsyP4KPPVAoeMfGchoAuBbV+r6v1OeYddocYmZkrsvMXUKF/uEcgd8gTu+pv3jObwIEVqXo1yC6ZlCFqO7LIvT8jTAAljkszoo67ykXTbKS0PZeLDg6nvdPvBMQ50yjfswR88S6N8VU6pud7Y+VbMYUiGzlrFi4MB9dikAjEj4PEetQyZdn84ZXGxerXlU/vTO2Fp4i1ec5rmX1P0WYMlbNELE408j5nfCfzD/qdcF5HZAiUVTYU/SWpzWcn34++KGpuqZZQdsGwCLQWeMeA/OEemYChis4cO94aOzrECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAlj5YIsbYXk2JGP9kRCBLDgz27ymYi1KDbO8g18V4T0zj2Zl7858U7mF9UBSSW+Cjl1UtUdvqFWZhh8jRoO3Jov1QGTULHRfyyPElD4VpwFribiu4GYJaodYy6NE50WwSJf32gLG0jHQWt7q+cOrn6WaG2h8O1sIxbTlnu1kqKQUQtu4oX8u23b5m9QXVJfJVdecwD5Rmab2d3dq/NNv2iNELH0myqtcoqw26xwIvXwaS4Gqi+Y0cOfjWL5Gv5AHIwvBXGIh3KUU7pbyBzqjkigbzSeoZw0C8G2cRTl0+QTuet2SVYlFh5J9/FBLvIfMfIpguglaU6xTVoRpo7RF24qQKFt2IlBROpqcwl0FyfE+2c19FGt1V8E5dYqE4T2mHT6FSOI3DckA2afBm1OCeMNtkqCQT8x+JvdKrgUh44QDm4PIVZDzaxog/zOzRWPCgpCPq0HcNMzgCVFt+4q8eTL9Ju/rQcS9bDosjMA69NGLIOCdPW2i/gkS9x9rTXgyp", "gossipEndpoint": [{ "ipAddressV4": "aJuKEQ==", "port": 30127 }, { "ipAddressV4": "CoAAGg==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIXlngkVEv6iMwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAL4o3FK8th1cG+FSlw4iT9FlkwK+hOj4Ay6Z70mZlsNwszgxvddUEO4BEdA1iSWfxkYOLl4QwwPr3l394a07VfB5OK3dqJ6CjVdByyvzghtk3gOpkskWlJxp6vah7BbIJFWE8off7fhCdwAGSrwIRdGE8u8GbKJIdHk6/XyjB3j0BXTIgeaPTJxLeuz/2l/dQVRMXyZNxlc5UVQYnX9haMRk7M5bkb9uwfYPRikEJFp6G72x7M7Q9lBGJ3ArCQn/lPJfHSg01GxfDhWH8DOwLaFdv1bCs2zHTn7R7Wq9ymXvkUsZhlYO4mLR8HKDcM3sCrJa2rg8vgnIoZupHABKxkgtT2wxV7fM5f2oiz0mDYDTRJpgmK1lmNANj2tKnGqeDnsW7Q3zwufgZZhbks8+8uigyOyKNbp6D7Vv5KeYRibjr/xh+yWT0v02dtpBIdhqDa5CUVD9fCwigZj3PQc8N4e47ZL6s1pXpQ6Cf0lB0fSsvyhnGRa8HMx2q5eg5j/lCQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQCr9yUzOoi0xhoDE1mqR3FR/iVCq9PaBUURWL743LDMrlEvpzKX0upcwwwdgJFjVqVUywh6rKeHQt4O4UV6FIbpp0PSjSE7XZSK3UNqnhZJhQ3aNrOP+6wBhm2B0ZjrxyMS1EWeD9tcNkdYluO00RlieAEV4zwoAfeFPSB21iXW5dU8idhNuTLptDc7SJoErxN+44jvcrSe/ZhpQohG6WfyDPH0BE1tyzsiD29PAWKkrfhg5kzjTAP/qFp+ByazeltP9/F0NXI5AHbE0pKYr56XUlwDfDZOTU9b1YeS7kKyPvccvC2j9NjGGM7NjafdFLHUTYBZiNUTZXVstddYtTCVbTqI7I/x6hoeeNVDZv7XluwZLrYsDNsNrWU3c9VijPK1CE5Owy+gJoGgxEHfA/n9Jvc3lEesqKBpW92RazkpHW2eD9wh8Ayv3q6PNDGzWyiXA8YWW6yD/dIp2Oh8szZUfOXy8sQ8VW86T6RsqGP5CKKPGW1NnP/KTKe5/WoBLZQ=", "gossipEndpoint": [{ "ipAddressV4": "IhuUGg==", "port": 30128 }, { "ipAddressV4": "CoAAGw==", "port": 30128 }] }] } | |||||||||
| node4 | 6.826s | 2025-11-17 21:06:32.813 | 36 | INFO | STARTUP | <main> | TransactionHandlingHistory: | Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv | |
| node4 | 6.827s | 2025-11-17 21:06:32.814 | 37 | INFO | STARTUP | <main> | TransactionHandlingHistory: | No log file found. Starting without any previous history | |
| node4 | 6.841s | 2025-11-17 21:06:32.828 | 38 | INFO | STARTUP | <main> | StateInitializer: | The platform is using the following initial state: | |
| Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: d1414917bf3d1ee8f982919e8ccc864e4703dfa42b094f2ed7c395ba1463b55b5894c4eb9d7e3860dd5e55677daf79af (root) VirtualMap state / subject-carry-blur-tackle | |||||||||
| node4 | 6.845s | 2025-11-17 21:06:32.832 | 40 | INFO | RECONNECT | <<platform-core: reconnectController>> | ReconnectController: | Starting the ReconnectController | |
| node2 | 6.877s | 2025-11-17 21:06:32.864 | 24 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node2 | 6.880s | 2025-11-17 21:06:32.867 | 27 | INFO | STARTUP | <main> | BootstrapUtils: | Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]. | |
| node2 | 6.887s | 2025-11-17 21:06:32.874 | 28 | INFO | STARTUP | <main> | AddressBookInitializer: | Starting from genesis: using the config address book. | |
| node2 | 6.900s | 2025-11-17 21:06:32.887 | 29 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node2 | 6.903s | 2025-11-17 21:06:32.890 | 30 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node4 | 7.054s | 2025-11-17 21:06:33.041 | 41 | INFO | EVENT_STREAM | <main> | DefaultConsensusEventStream: | EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b | |
| node4 | 7.059s | 2025-11-17 21:06:33.046 | 42 | INFO | STARTUP | <platformForkJoinThread-2> | Shadowgraph: | Shadowgraph starting from expiration threshold 1 | |
| node4 | 7.063s | 2025-11-17 21:06:33.050 | 43 | INFO | STARTUP | <<start-node-4>> | ConsistencyTestingToolMain: | init called in Main for node 4. | |
| node4 | 7.064s | 2025-11-17 21:06:33.051 | 44 | INFO | STARTUP | <<start-node-4>> | SwirldsPlatform: | Starting platform 4 | |
| node4 | 7.065s | 2025-11-17 21:06:33.052 | 45 | INFO | STARTUP | <<platform: recycle-bin-cleanup>> | RecycleBinImpl: | Deleted 0 files from the recycle bin. | |
| node4 | 7.069s | 2025-11-17 21:06:33.056 | 46 | INFO | STARTUP | <<start-node-4>> | CycleFinder: | No cyclical back pressure detected in wiring model. | |
| node4 | 7.070s | 2025-11-17 21:06:33.057 | 47 | INFO | STARTUP | <<start-node-4>> | DirectSchedulerChecks: | No illegal direct scheduler use detected in the wiring model. | |
| node4 | 7.071s | 2025-11-17 21:06:33.058 | 48 | INFO | STARTUP | <<start-node-4>> | InputWireChecks: | All input wires have been bound. | |
| node4 | 7.073s | 2025-11-17 21:06:33.060 | 49 | WARN | STARTUP | <<start-node-4>> | PcesFileTracker: | No preconsensus event files available | |
| node4 | 7.073s | 2025-11-17 21:06:33.060 | 50 | INFO | STARTUP | <<start-node-4>> | SwirldsPlatform: | replaying preconsensus event stream starting at 0 | |
| node4 | 7.075s | 2025-11-17 21:06:33.062 | 51 | INFO | STARTUP | <<start-node-4>> | PcesReplayer: | Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds. | |
| node4 | 7.076s | 2025-11-17 21:06:33.063 | 52 | INFO | STARTUP | <<app: appMain 4>> | ConsistencyTestingToolMain: | run called in Main. | |
| node4 | 7.079s | 2025-11-17 21:06:33.066 | 53 | INFO | PLATFORM_STATUS | <platformForkJoinThread-2> | StatusStateMachine: | Platform spent 175.0 ms in STARTING_UP. Now in REPLAYING_EVENTS | |
| node4 | 7.083s | 2025-11-17 21:06:33.070 | 54 | INFO | PLATFORM_STATUS | <platformForkJoinThread-2> | StatusStateMachine: | Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING | |
| node0 | 7.830s | 2025-11-17 21:06:33.817 | 55 | INFO | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting0.csv' ] | |
| node0 | 7.836s | 2025-11-17 21:06:33.823 | 56 | DEBUG | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ] | |
| node2 | 8.038s | 2025-11-17 21:06:34.025 | 31 | INFO | STARTUP | <main> | OSHealthChecker: | ||
| PASSED - Clock Source Speed Check Report[callsPerSec=26345460] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=224870, randomLong=3274252237001121867, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=12470, randomLong=1412257815537753590, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=1341680, data=35, exception=null] OS Health Check Report - Complete (took 1023 ms) | |||||||||
| node2 | 8.069s | 2025-11-17 21:06:34.056 | 32 | DEBUG | STARTUP | <main> | BootstrapUtils: | jvmPauseDetectorThread started | |
| node2 | 8.078s | 2025-11-17 21:06:34.065 | 33 | INFO | STARTUP | <main> | StandardScratchpad: | Scratchpad platform.iss contents: | |
| LAST_ISS_ROUND null | |||||||||
| node2 | 8.080s | 2025-11-17 21:06:34.067 | 34 | INFO | STARTUP | <main> | PlatformBuilder: | Default platform pool parallelism: 8 | |
| node2 | 8.165s | 2025-11-17 21:06:34.152 | 35 | INFO | STARTUP | <main> | SwirldsPlatform: | Starting with roster history: | |
| RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIdUmpLKzyXgUwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBALXCoDQ+HOVsEDTZpFuJITSaGwaKX2is5K1P/lV+G+ll6u36IdqKNnZIirJrpX2N0Ad6NeF/oFcMhietrKt818PDA9Tbb2tqcHNKTxxZAEj7amQTsrU4EsNmUhaPgMs89yj9WLxCXVzW05cQjqYEA/hymzohWs1BdU3Y2KdmELe0v5fzRgDpNgYHhUN7IrlrlgXEWpuKRskBYc4PIvyACijY0/zkeEAyHOshYYGKhQbNm/NGWhFq83ro77CZZhX3Vl7hRnHLaEoCEE8atY8R1Txhy8aObhiS6R8ZVRTkZLar/FG/xe78RQfwHHD1al2w5oHR7xgTZylhbD+nVQ09Zmi25USpvqwumbMBE0OWhV+VH1WLCHfLQs6/5yuDjeZ/0D9tpQ8pfkiEkGLedzUzQkq+4/HmN4IFTOhgJHlu1tVUqohZIPZ5zSzqkqFzFQGRo2uAX8C2EJ3qgQMAEOpH8iOjiSKsezlIPuwvmrVDPxVfpY2Cq60oxRu6B8bZdbQkfwIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQAloxwiVu7pBhkO4fLqYRw4FC0VEx+c47W4xnrq3G/uXMGwE2Mfwple9FZnfT9JgSoT1UVw+cigo4720WdrPqkK8qnA3/PzGXlfJ3k6eFcBuli/KY1TakIJUAxFt5biNKatheMwAKsbF/JyVyaqG2dbSaXQ6hZBLQTYmLrmFWMvi9QdM1S8vNVMjn0hE2qQJtnVRuVwqRaAQ225jDv2CUCT28t0EWE6ccbiRi74l8KoW1Lo3v2EQ6ZZ89Xt3CwFSQHa6YVT685ECy82qMysU+YHBe9WmwJW05UAAY7JRsOo+RuuU/r4acNLmzprG+l7qsqqPkwXTcziw9Y2OYsFgY4bTlIOV0JC0AYApctDB3gbn83LM73CWccGrXq0liSV0wL11wscH3gFohXrwb646+6hgncZiDshlZlWaFSkHQJAxTR9bsbsCwKdZpzIIVOVTOT/3oLQKCCQvPriTpJiNa0P6gB0pq64lNcyG9fL8vS3YFFnWJTZwb8ZzGK+LZ91/2Y=", "gossipEndpoint": [{ "ipAddressV4": "IjgVaw==", "port": 30124 }, { "ipAddressV4": "CoAAGA==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAJguXwyGFpb8MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTIwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTIwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDYXoYHBtw8adD5sxLZSnlG9XgBLWVbIDl3YA4rZZ11cgl6FG2TvF8UVNXQ177cRm1xUUJRI5ulSgDofnm7Iuf6c/GoQrud2nP1yMWewGslwiEi1h2pxbN7doFvn/92Y0lJVwSV/vOpbIyPRoMeF0jXd7TEI7dYj4S7gV9uWmQCIWjwTZqVsjIAtzEkYnmS0/m5XuD9MJsin8OQRu/PEFL8qaVPQJ2GhOhpUJqvADQ/Lsq/FHcPjylcRcnUQlFRojk2jqugtoRegByjPrAOSYGJeWUCVYmd7W51L/AkVx1rDLeHj0zLTTzQRF5G56i+S+tAcpY/uiCrwLvszFlDlD1diOuaucmu54lalrSTlVe5eOyq2ga2tKi11LQ+w09105zLyRWk7DBU93f5dTYNSmokI7b4sVRxu6SP0p/F9wND77wv2Ax5OpIWWty8zy8Y+xOuRyFu/rJ4ddDmRYvRmptM0rCAfv6hgd3m5Y/OAadQm/OuN91Uq9PIJdlMtjDbIfECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEANutmL3V1PlvlsZ6xG8Sx9cKTok3kf3rBf7D7eE8Nn8ryHi3cw9CvCaj1E6zmTTh9k23DAZVWulhjTY5GWcx5NO7QAWjKau44g/HecNNrWsD/+nIrhmAk2WxKp175CwqJaIWA7CM6VMfFktjaflUPcB6RJnHrAa8M1HUpEsBz0mFmLz7lIaDemxYCE8M8slb6wTMjpL83GB+ejudRe7YK2ZWixM+CGp0ARkV+EecHaCXgEoROUNwP6mZVJcgSVR1QBQwcGAMIrutsKENM8HR9o3LWacigoJXf+IX8c6aJhrHfFvm62q+hi3baj7iR6gebEdWPtmEXgoVWOk230fLGyPU1oBxaDdYa8V4+ZFv03O91By9tuFrwZOcLCb4CPRyr8A47lHNjRIeo2nUF/c+SjV0eBcPKCnn1nW/AQWCxJ0QzzG6tEeMAGdDrE2ujPlB+Y9Sn8vB0zjYQHTr1NKyyXNogB4y48jofLDLDGOQYI6uP2fDgZeiq4dV8w91WbPHV", "gossipEndpoint": [{ "ipAddressV4": "I8DbAg==", "port": 30125 }, { "ipAddressV4": "CoAAGQ==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAJwswl59m488MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDAqlNMpfduuW0ETQVjdKf5ZBe3Ug/ybRMoCWIlue8UoxFzamAtoeFEW3GVi862iImRVyHbkBZzDQUw4ABwMdxfzTL9voozkMaOZb4KQ9yZ9zNLAAmSSuE6RFmSJnBtfufxFXqiu6esbcvyropjZLc65F2uoMCpKN0CHFpWEb2GZAaipp7WCOon0NllDLqkjPylluXO4mjbzzMSDPbBWRD8VjjkxZeszWSXYxz9hqcRYX01CGg+jhooCQ6j2yB8sfFAffIeTG6GSV1uCFa4san2emhQWpr+cHaVYJMtejL43HaEVQnF3vh5Z10T/7co63C63aay2hs6Bx5SschosyYiafI7GtbQ4qpOgjEDFT1jlydK21gy6MV3SFEYwcUfxvxxRj6pS7xiMFn4FYnBKPJWkaDkwTqboEshxstvASQOW993uEwzh4EjctRHSjSuTU6S9OsWi5I5cRF+xK6GaWsTp0KyO8uVpuM9kZfpOcor294quyKJ9nylNyIt/m8Q8/ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAqPLB/xr0Yv1l9w/RO+bqFtl8TkxF/6jOqoEUXY06dEInopLYpmkksZZ9G8vebt6hAoLjaxNMdRqCkzKgy4jn7/SQZNV9FMbZ7ckiDxsBxYZ2ZaBootuWzzVD6hCSO3Tg6JgkIzldtFtNcDVBRgZnHg+Rl6hn+gFV5S2OTTTPHWK7GHwgHXLhK7N0RL4YVrRCi/HTUZnuYCjBwvdDte5iqytY05cAO4p72P6YtDaOdAfL/IIKd1ylCWITDqTp/JDBz1uxjQmsXLVD/KEEtlvYlGjIr+wUUqIUPhFvB6ajl2NO0D/r+t1BH454zbodU92QnOJpXpoNuOv7jjALHCqo70mCSwTNUSZuVP6/KLmQe8sSzYs7O/c25FzHKBYy+aZujoa/X7aI6XVmsUkj6ae9MSvQurk0jMNg/Jy5EtWOMy7WEuyadrAv6KSP3oIfmL9jWoPcyOMfvjRHxGqOfZuFZatAwswY6O0E3ATTrN03t/BVqNHIYIXc6UOiUTo2Nx56", "gossipEndpoint": [{ "ipAddressV4": "IjlJoA==", "port": 30126 }, { "ipAddressV4": "CoAAHA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAOxH0o7YkAUoMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDf6+SJl+puqRNd5r2Tb802jQTqPm7k3NXIeU8NQ3Hy9p0G+9p4Hgnt3ftipar7lKPKnp4PFrOP7E7XSKpafxK2OVQ0jTMvc6Yjqt+9mzyNSI1I8cSHTmhJ7kMBt0+NwVM8QN+fbKcbQaoNiPwMcckVtGeMad4aZM6hRyxzI0H3wgMj4JiM9VRwx7JbEo3R7akRwLwGr9ZQm2EQwqiyReNkBnXrsyP4KPPVAoeMfGchoAuBbV+r6v1OeYddocYmZkrsvMXUKF/uEcgd8gTu+pv3jObwIEVqXo1yC6ZlCFqO7LIvT8jTAAljkszoo67ykXTbKS0PZeLDg6nvdPvBMQ50yjfswR88S6N8VU6pud7Y+VbMYUiGzlrFi4MB9dikAjEj4PEetQyZdn84ZXGxerXlU/vTO2Fp4i1ec5rmX1P0WYMlbNELE408j5nfCfzD/qdcF5HZAiUVTYU/SWpzWcn34++KGpuqZZQdsGwCLQWeMeA/OEemYChis4cO94aOzrECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAlj5YIsbYXk2JGP9kRCBLDgz27ymYi1KDbO8g18V4T0zj2Zl7858U7mF9UBSSW+Cjl1UtUdvqFWZhh8jRoO3Jov1QGTULHRfyyPElD4VpwFribiu4GYJaodYy6NE50WwSJf32gLG0jHQWt7q+cOrn6WaG2h8O1sIxbTlnu1kqKQUQtu4oX8u23b5m9QXVJfJVdecwD5Rmab2d3dq/NNv2iNELH0myqtcoqw26xwIvXwaS4Gqi+Y0cOfjWL5Gv5AHIwvBXGIh3KUU7pbyBzqjkigbzSeoZw0C8G2cRTl0+QTuet2SVYlFh5J9/FBLvIfMfIpguglaU6xTVoRpo7RF24qQKFt2IlBROpqcwl0FyfE+2c19FGt1V8E5dYqE4T2mHT6FSOI3DckA2afBm1OCeMNtkqCQT8x+JvdKrgUh44QDm4PIVZDzaxog/zOzRWPCgpCPq0HcNMzgCVFt+4q8eTL9Ju/rQcS9bDosjMA69NGLIOCdPW2i/gkS9x9rTXgyp", "gossipEndpoint": [{ "ipAddressV4": "aJuKEQ==", "port": 30127 }, { "ipAddressV4": "CoAAGg==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIXlngkVEv6iMwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAL4o3FK8th1cG+FSlw4iT9FlkwK+hOj4Ay6Z70mZlsNwszgxvddUEO4BEdA1iSWfxkYOLl4QwwPr3l394a07VfB5OK3dqJ6CjVdByyvzghtk3gOpkskWlJxp6vah7BbIJFWE8off7fhCdwAGSrwIRdGE8u8GbKJIdHk6/XyjB3j0BXTIgeaPTJxLeuz/2l/dQVRMXyZNxlc5UVQYnX9haMRk7M5bkb9uwfYPRikEJFp6G72x7M7Q9lBGJ3ArCQn/lPJfHSg01GxfDhWH8DOwLaFdv1bCs2zHTn7R7Wq9ymXvkUsZhlYO4mLR8HKDcM3sCrJa2rg8vgnIoZupHABKxkgtT2wxV7fM5f2oiz0mDYDTRJpgmK1lmNANj2tKnGqeDnsW7Q3zwufgZZhbks8+8uigyOyKNbp6D7Vv5KeYRibjr/xh+yWT0v02dtpBIdhqDa5CUVD9fCwigZj3PQc8N4e47ZL6s1pXpQ6Cf0lB0fSsvyhnGRa8HMx2q5eg5j/lCQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQCr9yUzOoi0xhoDE1mqR3FR/iVCq9PaBUURWL743LDMrlEvpzKX0upcwwwdgJFjVqVUywh6rKeHQt4O4UV6FIbpp0PSjSE7XZSK3UNqnhZJhQ3aNrOP+6wBhm2B0ZjrxyMS1EWeD9tcNkdYluO00RlieAEV4zwoAfeFPSB21iXW5dU8idhNuTLptDc7SJoErxN+44jvcrSe/ZhpQohG6WfyDPH0BE1tyzsiD29PAWKkrfhg5kzjTAP/qFp+ByazeltP9/F0NXI5AHbE0pKYr56XUlwDfDZOTU9b1YeS7kKyPvccvC2j9NjGGM7NjafdFLHUTYBZiNUTZXVstddYtTCVbTqI7I/x6hoeeNVDZv7XluwZLrYsDNsNrWU3c9VijPK1CE5Owy+gJoGgxEHfA/n9Jvc3lEesqKBpW92RazkpHW2eD9wh8Ayv3q6PNDGzWyiXA8YWW6yD/dIp2Oh8szZUfOXy8sQ8VW86T6RsqGP5CKKPGW1NnP/KTKe5/WoBLZQ=", "gossipEndpoint": [{ "ipAddressV4": "IhuUGg==", "port": 30128 }, { "ipAddressV4": "CoAAGw==", "port": 30128 }] }] } | |||||||||
| node2 | 8.191s | 2025-11-17 21:06:34.178 | 36 | INFO | STARTUP | <main> | TransactionHandlingHistory: | Consistency testing tool log path: data/saved/consistency-test/2/ConsistencyTestLog.csv | |
| node2 | 8.191s | 2025-11-17 21:06:34.178 | 37 | INFO | STARTUP | <main> | TransactionHandlingHistory: | No log file found. Starting without any previous history | |
| node1 | 8.203s | 2025-11-17 21:06:34.190 | 55 | INFO | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting1.csv' ] | |
| node2 | 8.205s | 2025-11-17 21:06:34.192 | 38 | INFO | STARTUP | <main> | StateInitializer: | The platform is using the following initial state: | |
| Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: d1414917bf3d1ee8f982919e8ccc864e4703dfa42b094f2ed7c395ba1463b55b5894c4eb9d7e3860dd5e55677daf79af (root) VirtualMap state / subject-carry-blur-tackle | |||||||||
| node1 | 8.206s | 2025-11-17 21:06:34.193 | 56 | DEBUG | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ] | |
| node2 | 8.208s | 2025-11-17 21:06:34.195 | 40 | INFO | RECONNECT | <<platform-core: reconnectController>> | ReconnectController: | Starting the ReconnectController | |
| node2 | 8.428s | 2025-11-17 21:06:34.415 | 41 | INFO | EVENT_STREAM | <main> | DefaultConsensusEventStream: | EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b | |
| node2 | 8.432s | 2025-11-17 21:06:34.419 | 42 | INFO | STARTUP | <platformForkJoinThread-2> | Shadowgraph: | Shadowgraph starting from expiration threshold 1 | |
| node2 | 8.437s | 2025-11-17 21:06:34.424 | 43 | INFO | STARTUP | <<start-node-2>> | ConsistencyTestingToolMain: | init called in Main for node 2. | |
| node2 | 8.438s | 2025-11-17 21:06:34.425 | 44 | INFO | STARTUP | <<start-node-2>> | SwirldsPlatform: | Starting platform 2 | |
| node2 | 8.439s | 2025-11-17 21:06:34.426 | 45 | INFO | STARTUP | <<platform: recycle-bin-cleanup>> | RecycleBinImpl: | Deleted 0 files from the recycle bin. | |
| node2 | 8.442s | 2025-11-17 21:06:34.429 | 46 | INFO | STARTUP | <<start-node-2>> | CycleFinder: | No cyclical back pressure detected in wiring model. | |
| node2 | 8.443s | 2025-11-17 21:06:34.430 | 47 | INFO | STARTUP | <<start-node-2>> | DirectSchedulerChecks: | No illegal direct scheduler use detected in the wiring model. | |
| node2 | 8.444s | 2025-11-17 21:06:34.431 | 48 | INFO | STARTUP | <<start-node-2>> | InputWireChecks: | All input wires have been bound. | |
| node2 | 8.445s | 2025-11-17 21:06:34.432 | 49 | WARN | STARTUP | <<start-node-2>> | PcesFileTracker: | No preconsensus event files available | |
| node2 | 8.446s | 2025-11-17 21:06:34.433 | 50 | INFO | STARTUP | <<start-node-2>> | SwirldsPlatform: | replaying preconsensus event stream starting at 0 | |
| node2 | 8.448s | 2025-11-17 21:06:34.435 | 51 | INFO | STARTUP | <<start-node-2>> | PcesReplayer: | Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds. | |
| node2 | 8.449s | 2025-11-17 21:06:34.436 | 52 | INFO | STARTUP | <<app: appMain 2>> | ConsistencyTestingToolMain: | run called in Main. | |
| node2 | 8.451s | 2025-11-17 21:06:34.438 | 53 | INFO | PLATFORM_STATUS | <platformForkJoinThread-7> | StatusStateMachine: | Platform spent 184.0 ms in STARTING_UP. Now in REPLAYING_EVENTS | |
| node2 | 8.457s | 2025-11-17 21:06:34.444 | 54 | INFO | PLATFORM_STATUS | <platformForkJoinThread-7> | StatusStateMachine: | Platform spent 5.0 ms in REPLAYING_EVENTS. Now in OBSERVING | |
| node3 | 9.129s | 2025-11-17 21:06:35.116 | 55 | INFO | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting3.csv' ] | |
| node3 | 9.131s | 2025-11-17 21:06:35.118 | 56 | DEBUG | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ] | |
| node4 | 10.078s | 2025-11-17 21:06:36.065 | 55 | INFO | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting4.csv' ] | |
| node4 | 10.081s | 2025-11-17 21:06:36.068 | 56 | DEBUG | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ] | |
| node2 | 11.450s | 2025-11-17 21:06:37.437 | 55 | INFO | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting2.csv' ] | |
| node2 | 11.453s | 2025-11-17 21:06:37.440 | 56 | DEBUG | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ] | |
| node0 | 14.926s | 2025-11-17 21:06:40.913 | 57 | INFO | PLATFORM_STATUS | <platformForkJoinThread-2> | StatusStateMachine: | Platform spent 10.1 s in OBSERVING. Now in CHECKING | |
| node1 | 15.297s | 2025-11-17 21:06:41.284 | 57 | INFO | PLATFORM_STATUS | <platformForkJoinThread-3> | StatusStateMachine: | Platform spent 10.1 s in OBSERVING. Now in CHECKING | |
| node3 | 16.225s | 2025-11-17 21:06:42.212 | 57 | INFO | PLATFORM_STATUS | <platformForkJoinThread-2> | StatusStateMachine: | Platform spent 10.1 s in OBSERVING. Now in CHECKING | |
| node4 | 17.173s | 2025-11-17 21:06:43.160 | 57 | INFO | PLATFORM_STATUS | <platformForkJoinThread-3> | StatusStateMachine: | Platform spent 10.1 s in OBSERVING. Now in CHECKING | |
| node3 | 17.294s | 2025-11-17 21:06:43.281 | 59 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS | |
| node4 | 17.433s | 2025-11-17 21:06:43.420 | 59 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS | |
| node1 | 17.442s | 2025-11-17 21:06:43.429 | 59 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS | |
| node0 | 17.451s | 2025-11-17 21:06:43.438 | 58 | INFO | PLATFORM_STATUS | <platformForkJoinThread-4> | StatusStateMachine: | Platform spent 2.5 s in CHECKING. Now in ACTIVE | |
| node0 | 17.453s | 2025-11-17 21:06:43.440 | 60 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS | |
| node2 | 17.554s | 2025-11-17 21:06:43.541 | 58 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS | |
| node3 | 17.661s | 2025-11-17 21:06:43.648 | 74 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1 | |
| node3 | 17.663s | 2025-11-17 21:06:43.650 | 75 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for com.swirlds.demo.consistency.ConsistencyTestingToolState@14d0c33d | |
| node1 | 17.691s | 2025-11-17 21:06:43.678 | 74 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1 | |
| node1 | 17.693s | 2025-11-17 21:06:43.680 | 75 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for com.swirlds.demo.consistency.ConsistencyTestingToolState@2edb15dc | |
| node4 | 17.728s | 2025-11-17 21:06:43.715 | 74 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1 | |
| node4 | 17.731s | 2025-11-17 21:06:43.718 | 75 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for com.swirlds.demo.consistency.ConsistencyTestingToolState@42091eea | |
| node0 | 17.762s | 2025-11-17 21:06:43.749 | 75 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1 | |
| node0 | 17.764s | 2025-11-17 21:06:43.751 | 76 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for com.swirlds.demo.consistency.ConsistencyTestingToolState@5d67603c | |
| node2 | 17.798s | 2025-11-17 21:06:43.785 | 73 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1 | |
| node2 | 17.800s | 2025-11-17 21:06:43.787 | 74 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for com.swirlds.demo.consistency.ConsistencyTestingToolState@5d67603c | |
| node3 | 17.909s | 2025-11-17 21:06:43.896 | 105 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for com.swirlds.demo.consistency.ConsistencyTestingToolState@14d0c33d | |
| node3 | 17.912s | 2025-11-17 21:06:43.899 | 106 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 1 Timestamp: 2025-11-17T21:06:41.114955817Z Next consensus number: 1 Legacy running event hash: 736a07880730c900e8894d811dc50c09c9924dd9abc363def2e48e6884b33728e02345e1f0cdc99d8ab0f3e191b3555c Legacy running event mnemonic: hood-valve-toward-theme Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1450302654 Root hash: b726ad6c45ad078d1e4654376ff5be4104a21267b9439b8e6c9e64bf0b046160fac41ffacfe62d839d9881af121767bb (root) VirtualMap state / brave-family-remember-vast | |||||||||
| node1 | 17.921s | 2025-11-17 21:06:43.908 | 107 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for com.swirlds.demo.consistency.ConsistencyTestingToolState@2edb15dc | |
| node1 | 17.923s | 2025-11-17 21:06:43.910 | 108 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 1 Timestamp: 2025-11-17T21:06:41.114955817Z Next consensus number: 1 Legacy running event hash: 736a07880730c900e8894d811dc50c09c9924dd9abc363def2e48e6884b33728e02345e1f0cdc99d8ab0f3e191b3555c Legacy running event mnemonic: hood-valve-toward-theme Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1450302654 Root hash: b726ad6c45ad078d1e4654376ff5be4104a21267b9439b8e6c9e64bf0b046160fac41ffacfe62d839d9881af121767bb (root) VirtualMap state / brave-family-remember-vast | |||||||||
| node3 | 17.949s | 2025-11-17 21:06:43.936 | 107 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+06+41.067149334Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 17.950s | 2025-11-17 21:06:43.937 | 108 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 1 File: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+06+41.067149334Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 17.950s | 2025-11-17 21:06:43.937 | 109 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node3 | 17.951s | 2025-11-17 21:06:43.938 | 110 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node3 | 17.957s | 2025-11-17 21:06:43.944 | 111 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node1 | 17.961s | 2025-11-17 21:06:43.948 | 109 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+06+41.121642458Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node1 | 17.961s | 2025-11-17 21:06:43.948 | 110 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 1 File: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+06+41.121642458Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node1 | 17.961s | 2025-11-17 21:06:43.948 | 111 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node1 | 17.962s | 2025-11-17 21:06:43.949 | 112 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node1 | 17.968s | 2025-11-17 21:06:43.955 | 113 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node4 | 17.974s | 2025-11-17 21:06:43.961 | 108 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for com.swirlds.demo.consistency.ConsistencyTestingToolState@42091eea | |
| node4 | 17.978s | 2025-11-17 21:06:43.965 | 109 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 1 Timestamp: 2025-11-17T21:06:41.114955817Z Next consensus number: 1 Legacy running event hash: 736a07880730c900e8894d811dc50c09c9924dd9abc363def2e48e6884b33728e02345e1f0cdc99d8ab0f3e191b3555c Legacy running event mnemonic: hood-valve-toward-theme Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1450302654 Root hash: b726ad6c45ad078d1e4654376ff5be4104a21267b9439b8e6c9e64bf0b046160fac41ffacfe62d839d9881af121767bb (root) VirtualMap state / brave-family-remember-vast | |||||||||
| node1 | 17.980s | 2025-11-17 21:06:43.967 | 115 | INFO | PLATFORM_STATUS | <platformForkJoinThread-7> | StatusStateMachine: | Platform spent 2.7 s in CHECKING. Now in ACTIVE | |
| node0 | 18.000s | 2025-11-17 21:06:43.987 | 108 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for com.swirlds.demo.consistency.ConsistencyTestingToolState@5d67603c | |
| node0 | 18.003s | 2025-11-17 21:06:43.990 | 109 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 1 Timestamp: 2025-11-17T21:06:41.114955817Z Next consensus number: 1 Legacy running event hash: 736a07880730c900e8894d811dc50c09c9924dd9abc363def2e48e6884b33728e02345e1f0cdc99d8ab0f3e191b3555c Legacy running event mnemonic: hood-valve-toward-theme Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1450302654 Root hash: b726ad6c45ad078d1e4654376ff5be4104a21267b9439b8e6c9e64bf0b046160fac41ffacfe62d839d9881af121767bb (root) VirtualMap state / brave-family-remember-vast | |||||||||
| node4 | 18.017s | 2025-11-17 21:06:44.004 | 112 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/4/2025/11/17/2025-11-17T21+06+41.140869703Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node4 | 18.017s | 2025-11-17 21:06:44.004 | 113 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 1 File: data/saved/preconsensus-events/4/2025/11/17/2025-11-17T21+06+41.140869703Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node4 | 18.018s | 2025-11-17 21:06:44.005 | 114 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node4 | 18.019s | 2025-11-17 21:06:44.006 | 115 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node4 | 18.024s | 2025-11-17 21:06:44.011 | 116 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node0 | 18.041s | 2025-11-17 21:06:44.028 | 110 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+06+40.946965742Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 18.041s | 2025-11-17 21:06:44.028 | 111 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 1 File: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+06+40.946965742Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 18.042s | 2025-11-17 21:06:44.029 | 112 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node0 | 18.043s | 2025-11-17 21:06:44.030 | 113 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node2 | 18.047s | 2025-11-17 21:06:44.034 | 107 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for com.swirlds.demo.consistency.ConsistencyTestingToolState@5d67603c | |
| node0 | 18.048s | 2025-11-17 21:06:44.035 | 114 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node2 | 18.050s | 2025-11-17 21:06:44.037 | 108 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 1 Timestamp: 2025-11-17T21:06:41.114955817Z Next consensus number: 1 Legacy running event hash: 736a07880730c900e8894d811dc50c09c9924dd9abc363def2e48e6884b33728e02345e1f0cdc99d8ab0f3e191b3555c Legacy running event mnemonic: hood-valve-toward-theme Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1450302654 Root hash: b726ad6c45ad078d1e4654376ff5be4104a21267b9439b8e6c9e64bf0b046160fac41ffacfe62d839d9881af121767bb (root) VirtualMap state / brave-family-remember-vast | |||||||||
| node3 | 18.087s | 2025-11-17 21:06:44.074 | 113 | INFO | PLATFORM_STATUS | <platformForkJoinThread-8> | StatusStateMachine: | Platform spent 1.9 s in CHECKING. Now in ACTIVE | |
| node2 | 18.094s | 2025-11-17 21:06:44.081 | 109 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+06+41.228512734Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 18.095s | 2025-11-17 21:06:44.082 | 110 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 1 File: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+06+41.228512734Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 18.096s | 2025-11-17 21:06:44.083 | 111 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node2 | 18.097s | 2025-11-17 21:06:44.084 | 112 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node2 | 18.103s | 2025-11-17 21:06:44.090 | 113 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node2 | 18.545s | 2025-11-17 21:06:44.532 | 122 | INFO | PLATFORM_STATUS | <platformForkJoinThread-2> | StatusStateMachine: | Platform spent 10.1 s in OBSERVING. Now in CHECKING | |
| node4 | 19.156s | 2025-11-17 21:06:45.143 | 136 | INFO | PLATFORM_STATUS | <platformForkJoinThread-5> | StatusStateMachine: | Platform spent 2.0 s in CHECKING. Now in ACTIVE | |
| node2 | 19.925s | 2025-11-17 21:06:45.912 | 156 | INFO | PLATFORM_STATUS | <platformForkJoinThread-5> | StatusStateMachine: | Platform spent 1.4 s in CHECKING. Now in ACTIVE | |
| node2 | 35.185s | 2025-11-17 21:07:01.172 | 482 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 38 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node3 | 35.186s | 2025-11-17 21:07:01.173 | 464 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 38 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node1 | 35.328s | 2025-11-17 21:07:01.315 | 479 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 38 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node4 | 35.356s | 2025-11-17 21:07:01.343 | 474 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 38 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node0 | 35.368s | 2025-11-17 21:07:01.355 | 479 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 38 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node1 | 35.370s | 2025-11-17 21:07:01.357 | 482 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 38 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/38 | |
| node1 | 35.371s | 2025-11-17 21:07:01.358 | 483 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for com.swirlds.demo.consistency.ConsistencyTestingToolState@54f598ec | |
| node0 | 35.435s | 2025-11-17 21:07:01.422 | 502 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 38 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/38 | |
| node0 | 35.437s | 2025-11-17 21:07:01.424 | 503 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for com.swirlds.demo.consistency.ConsistencyTestingToolState@2e8ad9f | |
| node3 | 35.441s | 2025-11-17 21:07:01.428 | 470 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 38 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/38 | |
| node3 | 35.441s | 2025-11-17 21:07:01.428 | 471 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for com.swirlds.demo.consistency.ConsistencyTestingToolState@56f6a9a6 | |
| node1 | 35.450s | 2025-11-17 21:07:01.437 | 540 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for com.swirlds.demo.consistency.ConsistencyTestingToolState@54f598ec | |
| node1 | 35.452s | 2025-11-17 21:07:01.439 | 541 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 38 Timestamp: 2025-11-17T21:07:00.040668Z Next consensus number: 1299 Legacy running event hash: e4c3f73b9ff6dbb043329d113cd7bab5fa57a05350ea0da26df36843cf19690e8625e6016b7070cf182da7dc4475037c Legacy running event mnemonic: guilt-board-access-meat Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -1217122809 Root hash: a293e1089ac0da6ee764ff0d5e82a4ac6ef38366a511f2920118182806d01c748147dd7a800e2a2e1ed7eba1aad00781 (root) VirtualMap state / pumpkin-tenant-pool-rather | |||||||||
| node1 | 35.460s | 2025-11-17 21:07:01.447 | 542 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+06+41.121642458Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node1 | 35.461s | 2025-11-17 21:07:01.448 | 543 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 11 File: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+06+41.121642458Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node1 | 35.461s | 2025-11-17 21:07:01.448 | 544 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node1 | 35.463s | 2025-11-17 21:07:01.450 | 545 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node1 | 35.463s | 2025-11-17 21:07:01.450 | 546 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 38 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/38 {"round":38,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/38/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node4 | 35.514s | 2025-11-17 21:07:01.501 | 477 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 38 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/38 | |
| node4 | 35.515s | 2025-11-17 21:07:01.502 | 478 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for com.swirlds.demo.consistency.ConsistencyTestingToolState@c3e7f6e | |
| node0 | 35.522s | 2025-11-17 21:07:01.509 | 534 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for com.swirlds.demo.consistency.ConsistencyTestingToolState@2e8ad9f | |
| node0 | 35.524s | 2025-11-17 21:07:01.511 | 535 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 38 Timestamp: 2025-11-17T21:07:00.040668Z Next consensus number: 1299 Legacy running event hash: e4c3f73b9ff6dbb043329d113cd7bab5fa57a05350ea0da26df36843cf19690e8625e6016b7070cf182da7dc4475037c Legacy running event mnemonic: guilt-board-access-meat Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -1217122809 Root hash: a293e1089ac0da6ee764ff0d5e82a4ac6ef38366a511f2920118182806d01c748147dd7a800e2a2e1ed7eba1aad00781 (root) VirtualMap state / pumpkin-tenant-pool-rather | |||||||||
| node3 | 35.525s | 2025-11-17 21:07:01.512 | 520 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for com.swirlds.demo.consistency.ConsistencyTestingToolState@56f6a9a6 | |
| node3 | 35.527s | 2025-11-17 21:07:01.514 | 529 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 38 Timestamp: 2025-11-17T21:07:00.040668Z Next consensus number: 1299 Legacy running event hash: e4c3f73b9ff6dbb043329d113cd7bab5fa57a05350ea0da26df36843cf19690e8625e6016b7070cf182da7dc4475037c Legacy running event mnemonic: guilt-board-access-meat Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -1217122809 Root hash: a293e1089ac0da6ee764ff0d5e82a4ac6ef38366a511f2920118182806d01c748147dd7a800e2a2e1ed7eba1aad00781 (root) VirtualMap state / pumpkin-tenant-pool-rather | |||||||||
| node0 | 35.533s | 2025-11-17 21:07:01.520 | 536 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+06+40.946965742Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 35.533s | 2025-11-17 21:07:01.520 | 537 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 11 File: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+06+40.946965742Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 35.534s | 2025-11-17 21:07:01.521 | 538 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node0 | 35.535s | 2025-11-17 21:07:01.522 | 539 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node0 | 35.536s | 2025-11-17 21:07:01.523 | 540 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 38 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/38 {"round":38,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/38/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node3 | 35.537s | 2025-11-17 21:07:01.524 | 530 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+06+41.067149334Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 35.538s | 2025-11-17 21:07:01.525 | 531 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 11 File: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+06+41.067149334Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 35.538s | 2025-11-17 21:07:01.525 | 532 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node3 | 35.540s | 2025-11-17 21:07:01.527 | 533 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node3 | 35.540s | 2025-11-17 21:07:01.527 | 534 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 38 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/38 {"round":38,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/38/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node2 | 35.544s | 2025-11-17 21:07:01.531 | 509 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 38 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/38 | |
| node2 | 35.545s | 2025-11-17 21:07:01.532 | 510 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for com.swirlds.demo.consistency.ConsistencyTestingToolState@2e8ad9f | |
| node4 | 35.608s | 2025-11-17 21:07:01.595 | 513 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for com.swirlds.demo.consistency.ConsistencyTestingToolState@c3e7f6e | |
| node4 | 35.610s | 2025-11-17 21:07:01.597 | 514 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 38 Timestamp: 2025-11-17T21:07:00.040668Z Next consensus number: 1299 Legacy running event hash: e4c3f73b9ff6dbb043329d113cd7bab5fa57a05350ea0da26df36843cf19690e8625e6016b7070cf182da7dc4475037c Legacy running event mnemonic: guilt-board-access-meat Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -1217122809 Root hash: a293e1089ac0da6ee764ff0d5e82a4ac6ef38366a511f2920118182806d01c748147dd7a800e2a2e1ed7eba1aad00781 (root) VirtualMap state / pumpkin-tenant-pool-rather | |||||||||
| node4 | 35.618s | 2025-11-17 21:07:01.605 | 515 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/4/2025/11/17/2025-11-17T21+06+41.140869703Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node4 | 35.619s | 2025-11-17 21:07:01.606 | 516 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 11 File: data/saved/preconsensus-events/4/2025/11/17/2025-11-17T21+06+41.140869703Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node4 | 35.619s | 2025-11-17 21:07:01.606 | 517 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node4 | 35.621s | 2025-11-17 21:07:01.608 | 518 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node4 | 35.621s | 2025-11-17 21:07:01.608 | 519 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 38 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/38 {"round":38,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/38/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node2 | 35.668s | 2025-11-17 21:07:01.655 | 549 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for com.swirlds.demo.consistency.ConsistencyTestingToolState@2e8ad9f | |
| node2 | 35.671s | 2025-11-17 21:07:01.658 | 551 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 38 Timestamp: 2025-11-17T21:07:00.040668Z Next consensus number: 1299 Legacy running event hash: e4c3f73b9ff6dbb043329d113cd7bab5fa57a05350ea0da26df36843cf19690e8625e6016b7070cf182da7dc4475037c Legacy running event mnemonic: guilt-board-access-meat Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -1217122809 Root hash: a293e1089ac0da6ee764ff0d5e82a4ac6ef38366a511f2920118182806d01c748147dd7a800e2a2e1ed7eba1aad00781 (root) VirtualMap state / pumpkin-tenant-pool-rather | |||||||||
| node2 | 35.680s | 2025-11-17 21:07:01.667 | 554 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+06+41.228512734Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 35.680s | 2025-11-17 21:07:01.667 | 555 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 11 File: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+06+41.228512734Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 35.681s | 2025-11-17 21:07:01.668 | 556 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node2 | 35.682s | 2025-11-17 21:07:01.669 | 557 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node2 | 35.683s | 2025-11-17 21:07:01.670 | 558 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 38 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/38 {"round":38,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/38/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node0 | 1m 35.050s | 2025-11-17 21:08:01.037 | 2003 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 174 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node3 | 1m 35.050s | 2025-11-17 21:08:01.037 | 1985 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 174 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node2 | 1m 35.061s | 2025-11-17 21:08:01.048 | 2022 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 174 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node4 | 1m 35.107s | 2025-11-17 21:08:01.094 | 1980 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 174 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node1 | 1m 35.141s | 2025-11-17 21:08:01.128 | 2001 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 174 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node1 | 1m 35.217s | 2025-11-17 21:08:01.204 | 2024 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 174 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/174 | |
| node1 | 1m 35.217s | 2025-11-17 21:08:01.204 | 2025 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for com.swirlds.demo.consistency.ConsistencyTestingToolState@47c1fc16 | |
| node0 | 1m 35.247s | 2025-11-17 21:08:01.234 | 2022 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 174 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/174 | |
| node0 | 1m 35.248s | 2025-11-17 21:08:01.235 | 2023 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for com.swirlds.demo.consistency.ConsistencyTestingToolState@387ead84 | |
| node3 | 1m 35.253s | 2025-11-17 21:08:01.240 | 1988 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 174 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/174 | |
| node3 | 1m 35.254s | 2025-11-17 21:08:01.241 | 1989 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for com.swirlds.demo.consistency.ConsistencyTestingToolState@579ebf02 | |
| node1 | 1m 35.326s | 2025-11-17 21:08:01.313 | 2056 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for com.swirlds.demo.consistency.ConsistencyTestingToolState@47c1fc16 | |
| node1 | 1m 35.328s | 2025-11-17 21:08:01.315 | 2057 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 174 Timestamp: 2025-11-17T21:08:00.118498Z Next consensus number: 6121 Legacy running event hash: 8b64fae2169795e153f6d15ef858c07a998acb39a80f884f2d212fe7000a5a96a2a5ed7d4d1a528c7a64803cbc295d36 Legacy running event mnemonic: near-phrase-spray-dumb Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1292972803 Root hash: e7e247fcf1bfcdec9bccd170c1d26ef96c8d3c0c28e02b9d59592c0958d103bec6370e2a9b74cec07a998bd0a3178af8 (root) VirtualMap state / adult-zebra-sight-dream | |||||||||
| node1 | 1m 35.335s | 2025-11-17 21:08:01.322 | 2058 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+06+41.121642458Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node1 | 1m 35.335s | 2025-11-17 21:08:01.322 | 2059 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 146 File: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+06+41.121642458Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node1 | 1m 35.335s | 2025-11-17 21:08:01.322 | 2060 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node1 | 1m 35.340s | 2025-11-17 21:08:01.327 | 2061 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node3 | 1m 35.340s | 2025-11-17 21:08:01.327 | 2046 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for com.swirlds.demo.consistency.ConsistencyTestingToolState@579ebf02 | |
| node1 | 1m 35.341s | 2025-11-17 21:08:01.328 | 2062 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 174 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/174 {"round":174,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/174/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node3 | 1m 35.342s | 2025-11-17 21:08:01.329 | 2047 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 174 Timestamp: 2025-11-17T21:08:00.118498Z Next consensus number: 6121 Legacy running event hash: 8b64fae2169795e153f6d15ef858c07a998acb39a80f884f2d212fe7000a5a96a2a5ed7d4d1a528c7a64803cbc295d36 Legacy running event mnemonic: near-phrase-spray-dumb Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1292972803 Root hash: e7e247fcf1bfcdec9bccd170c1d26ef96c8d3c0c28e02b9d59592c0958d103bec6370e2a9b74cec07a998bd0a3178af8 (root) VirtualMap state / adult-zebra-sight-dream | |||||||||
| node0 | 1m 35.344s | 2025-11-17 21:08:01.331 | 2054 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for com.swirlds.demo.consistency.ConsistencyTestingToolState@387ead84 | |
| node0 | 1m 35.346s | 2025-11-17 21:08:01.333 | 2055 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 174 Timestamp: 2025-11-17T21:08:00.118498Z Next consensus number: 6121 Legacy running event hash: 8b64fae2169795e153f6d15ef858c07a998acb39a80f884f2d212fe7000a5a96a2a5ed7d4d1a528c7a64803cbc295d36 Legacy running event mnemonic: near-phrase-spray-dumb Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1292972803 Root hash: e7e247fcf1bfcdec9bccd170c1d26ef96c8d3c0c28e02b9d59592c0958d103bec6370e2a9b74cec07a998bd0a3178af8 (root) VirtualMap state / adult-zebra-sight-dream | |||||||||
| node3 | 1m 35.350s | 2025-11-17 21:08:01.337 | 2048 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+06+41.067149334Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 1m 35.350s | 2025-11-17 21:08:01.337 | 2049 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 146 File: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+06+41.067149334Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 1m 35.351s | 2025-11-17 21:08:01.338 | 2050 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node0 | 1m 35.355s | 2025-11-17 21:08:01.342 | 2056 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+06+40.946965742Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 1m 35.355s | 2025-11-17 21:08:01.342 | 2051 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node0 | 1m 35.356s | 2025-11-17 21:08:01.343 | 2057 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 146 File: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+06+40.946965742Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 1m 35.356s | 2025-11-17 21:08:01.343 | 2058 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node3 | 1m 35.356s | 2025-11-17 21:08:01.343 | 2052 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 174 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/174 {"round":174,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/174/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node0 | 1m 35.360s | 2025-11-17 21:08:01.347 | 2059 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node0 | 1m 35.361s | 2025-11-17 21:08:01.348 | 2060 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 174 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/174 {"round":174,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/174/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node4 | 1m 35.389s | 2025-11-17 21:08:01.376 | 1999 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 174 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/174 | |
| node4 | 1m 35.390s | 2025-11-17 21:08:01.377 | 2000 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for com.swirlds.demo.consistency.ConsistencyTestingToolState@5d9b2042 | |
| node2 | 1m 35.421s | 2025-11-17 21:08:01.408 | 2041 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 174 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/174 | |
| node2 | 1m 35.422s | 2025-11-17 21:08:01.409 | 2042 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for com.swirlds.demo.consistency.ConsistencyTestingToolState@63f45c6b | |
| node4 | 1m 35.473s | 2025-11-17 21:08:01.460 | 2039 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for com.swirlds.demo.consistency.ConsistencyTestingToolState@5d9b2042 | |
| node4 | 1m 35.475s | 2025-11-17 21:08:01.462 | 2040 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 174 Timestamp: 2025-11-17T21:08:00.118498Z Next consensus number: 6121 Legacy running event hash: 8b64fae2169795e153f6d15ef858c07a998acb39a80f884f2d212fe7000a5a96a2a5ed7d4d1a528c7a64803cbc295d36 Legacy running event mnemonic: near-phrase-spray-dumb Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1292972803 Root hash: e7e247fcf1bfcdec9bccd170c1d26ef96c8d3c0c28e02b9d59592c0958d103bec6370e2a9b74cec07a998bd0a3178af8 (root) VirtualMap state / adult-zebra-sight-dream | |||||||||
| node4 | 1m 35.483s | 2025-11-17 21:08:01.470 | 2041 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/4/2025/11/17/2025-11-17T21+06+41.140869703Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node4 | 1m 35.483s | 2025-11-17 21:08:01.470 | 2042 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 146 File: data/saved/preconsensus-events/4/2025/11/17/2025-11-17T21+06+41.140869703Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node4 | 1m 35.484s | 2025-11-17 21:08:01.471 | 2043 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node4 | 1m 35.488s | 2025-11-17 21:08:01.475 | 2044 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node4 | 1m 35.489s | 2025-11-17 21:08:01.476 | 2045 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 174 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/174 {"round":174,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/174/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node2 | 1m 35.520s | 2025-11-17 21:08:01.507 | 2081 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for com.swirlds.demo.consistency.ConsistencyTestingToolState@63f45c6b | |
| node2 | 1m 35.522s | 2025-11-17 21:08:01.509 | 2082 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 174 Timestamp: 2025-11-17T21:08:00.118498Z Next consensus number: 6121 Legacy running event hash: 8b64fae2169795e153f6d15ef858c07a998acb39a80f884f2d212fe7000a5a96a2a5ed7d4d1a528c7a64803cbc295d36 Legacy running event mnemonic: near-phrase-spray-dumb Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1292972803 Root hash: e7e247fcf1bfcdec9bccd170c1d26ef96c8d3c0c28e02b9d59592c0958d103bec6370e2a9b74cec07a998bd0a3178af8 (root) VirtualMap state / adult-zebra-sight-dream | |||||||||
| node2 | 1m 35.531s | 2025-11-17 21:08:01.518 | 2083 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+06+41.228512734Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 1m 35.531s | 2025-11-17 21:08:01.518 | 2084 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 146 File: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+06+41.228512734Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 1m 35.531s | 2025-11-17 21:08:01.518 | 2085 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node2 | 1m 35.536s | 2025-11-17 21:08:01.523 | 2086 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node2 | 1m 35.536s | 2025-11-17 21:08:01.523 | 2087 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 174 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/174 {"round":174,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/174/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node0 | 2m 35.090s | 2025-11-17 21:09:01.077 | 3538 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 309 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node4 | 2m 35.231s | 2025-11-17 21:09:01.218 | 3491 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 309 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node1 | 2m 35.235s | 2025-11-17 21:09:01.222 | 3538 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 309 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node2 | 2m 35.274s | 2025-11-17 21:09:01.261 | 3563 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 309 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node3 | 2m 35.383s | 2025-11-17 21:09:01.370 | 3498 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 309 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node2 | 2m 35.419s | 2025-11-17 21:09:01.406 | 3566 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 309 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/309 | |
| node2 | 2m 35.420s | 2025-11-17 21:09:01.407 | 3567 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for com.swirlds.demo.consistency.ConsistencyTestingToolState@34ba6e95 | |
| node3 | 2m 35.449s | 2025-11-17 21:09:01.436 | 3502 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 309 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/309 | |
| node3 | 2m 35.450s | 2025-11-17 21:09:01.437 | 3505 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for com.swirlds.demo.consistency.ConsistencyTestingToolState@7a259340 | |
| node0 | 2m 35.490s | 2025-11-17 21:09:01.477 | 3541 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 309 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/309 | |
| node1 | 2m 35.490s | 2025-11-17 21:09:01.477 | 3541 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 309 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/309 | |
| node1 | 2m 35.490s | 2025-11-17 21:09:01.477 | 3542 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for com.swirlds.demo.consistency.ConsistencyTestingToolState@67ae57da | |
| node0 | 2m 35.491s | 2025-11-17 21:09:01.478 | 3542 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for com.swirlds.demo.consistency.ConsistencyTestingToolState@e81daa6 | |
| node2 | 2m 35.506s | 2025-11-17 21:09:01.493 | 3606 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for com.swirlds.demo.consistency.ConsistencyTestingToolState@34ba6e95 | |
| node2 | 2m 35.508s | 2025-11-17 21:09:01.495 | 3607 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 309 Timestamp: 2025-11-17T21:09:00.190305213Z Next consensus number: 10898 Legacy running event hash: 28282c7a1356e3441ed8a979521f15357c67be794c7bfa6593febb60908c1d7f8c594ab462953e6ab631678c1b8fe1ea Legacy running event mnemonic: agree-monster-hello-nurse Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -617936686 Root hash: f29f0997b970a387e63a252b6440e70e3ef2e3dae051a9d44bafd4f741150dded73917af4d57757a24ee164ac28757b1 (root) VirtualMap state / promote-cream-spirit-monkey | |||||||||
| node2 | 2m 35.516s | 2025-11-17 21:09:01.503 | 3608 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+06+41.228512734Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 2m 35.516s | 2025-11-17 21:09:01.503 | 3609 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 282 File: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+06+41.228512734Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 2m 35.516s | 2025-11-17 21:09:01.503 | 3610 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node2 | 2m 35.524s | 2025-11-17 21:09:01.511 | 3614 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node2 | 2m 35.525s | 2025-11-17 21:09:01.512 | 3615 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 309 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/309 {"round":309,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/309/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node3 | 2m 35.531s | 2025-11-17 21:09:01.518 | 3548 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for com.swirlds.demo.consistency.ConsistencyTestingToolState@7a259340 | |
| node3 | 2m 35.533s | 2025-11-17 21:09:01.520 | 3549 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 309 Timestamp: 2025-11-17T21:09:00.190305213Z Next consensus number: 10898 Legacy running event hash: 28282c7a1356e3441ed8a979521f15357c67be794c7bfa6593febb60908c1d7f8c594ab462953e6ab631678c1b8fe1ea Legacy running event mnemonic: agree-monster-hello-nurse Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -617936686 Root hash: f29f0997b970a387e63a252b6440e70e3ef2e3dae051a9d44bafd4f741150dded73917af4d57757a24ee164ac28757b1 (root) VirtualMap state / promote-cream-spirit-monkey | |||||||||
| node3 | 2m 35.540s | 2025-11-17 21:09:01.527 | 3550 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+06+41.067149334Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 2m 35.540s | 2025-11-17 21:09:01.527 | 3551 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 282 File: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+06+41.067149334Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 2m 35.541s | 2025-11-17 21:09:01.528 | 3552 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node3 | 2m 35.548s | 2025-11-17 21:09:01.535 | 3553 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node3 | 2m 35.549s | 2025-11-17 21:09:01.536 | 3554 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 309 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/309 {"round":309,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/309/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node0 | 2m 35.578s | 2025-11-17 21:09:01.565 | 3573 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for com.swirlds.demo.consistency.ConsistencyTestingToolState@e81daa6 | |
| node1 | 2m 35.579s | 2025-11-17 21:09:01.566 | 3576 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for com.swirlds.demo.consistency.ConsistencyTestingToolState@67ae57da | |
| node0 | 2m 35.580s | 2025-11-17 21:09:01.567 | 3575 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 309 Timestamp: 2025-11-17T21:09:00.190305213Z Next consensus number: 10898 Legacy running event hash: 28282c7a1356e3441ed8a979521f15357c67be794c7bfa6593febb60908c1d7f8c594ab462953e6ab631678c1b8fe1ea Legacy running event mnemonic: agree-monster-hello-nurse Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -617936686 Root hash: f29f0997b970a387e63a252b6440e70e3ef2e3dae051a9d44bafd4f741150dded73917af4d57757a24ee164ac28757b1 (root) VirtualMap state / promote-cream-spirit-monkey | |||||||||
| node1 | 2m 35.581s | 2025-11-17 21:09:01.568 | 3577 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 309 Timestamp: 2025-11-17T21:09:00.190305213Z Next consensus number: 10898 Legacy running event hash: 28282c7a1356e3441ed8a979521f15357c67be794c7bfa6593febb60908c1d7f8c594ab462953e6ab631678c1b8fe1ea Legacy running event mnemonic: agree-monster-hello-nurse Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -617936686 Root hash: f29f0997b970a387e63a252b6440e70e3ef2e3dae051a9d44bafd4f741150dded73917af4d57757a24ee164ac28757b1 (root) VirtualMap state / promote-cream-spirit-monkey | |||||||||
| node0 | 2m 35.588s | 2025-11-17 21:09:01.575 | 3578 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+06+40.946965742Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 2m 35.588s | 2025-11-17 21:09:01.575 | 3579 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 282 File: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+06+40.946965742Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 2m 35.588s | 2025-11-17 21:09:01.575 | 3580 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node1 | 2m 35.590s | 2025-11-17 21:09:01.577 | 3578 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+06+41.121642458Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node1 | 2m 35.590s | 2025-11-17 21:09:01.577 | 3579 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 282 File: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+06+41.121642458Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node1 | 2m 35.590s | 2025-11-17 21:09:01.577 | 3580 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node0 | 2m 35.596s | 2025-11-17 21:09:01.583 | 3581 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node0 | 2m 35.596s | 2025-11-17 21:09:01.583 | 3582 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 309 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/309 {"round":309,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/309/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node1 | 2m 35.598s | 2025-11-17 21:09:01.585 | 3591 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node1 | 2m 35.598s | 2025-11-17 21:09:01.585 | 3592 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 309 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/309 {"round":309,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/309/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node4 | 2m 35.603s | 2025-11-17 21:09:01.590 | 3494 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 309 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/309 | |
| node4 | 2m 35.603s | 2025-11-17 21:09:01.590 | 3496 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for com.swirlds.demo.consistency.ConsistencyTestingToolState@73d87fc2 | |
| node4 | 2m 35.685s | 2025-11-17 21:09:01.672 | 3541 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for com.swirlds.demo.consistency.ConsistencyTestingToolState@73d87fc2 | |
| node4 | 2m 35.687s | 2025-11-17 21:09:01.674 | 3542 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 309 Timestamp: 2025-11-17T21:09:00.190305213Z Next consensus number: 10898 Legacy running event hash: 28282c7a1356e3441ed8a979521f15357c67be794c7bfa6593febb60908c1d7f8c594ab462953e6ab631678c1b8fe1ea Legacy running event mnemonic: agree-monster-hello-nurse Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -617936686 Root hash: f29f0997b970a387e63a252b6440e70e3ef2e3dae051a9d44bafd4f741150dded73917af4d57757a24ee164ac28757b1 (root) VirtualMap state / promote-cream-spirit-monkey | |||||||||
| node4 | 2m 35.694s | 2025-11-17 21:09:01.681 | 3543 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/4/2025/11/17/2025-11-17T21+06+41.140869703Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node4 | 2m 35.694s | 2025-11-17 21:09:01.681 | 3544 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 282 File: data/saved/preconsensus-events/4/2025/11/17/2025-11-17T21+06+41.140869703Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node4 | 2m 35.695s | 2025-11-17 21:09:01.682 | 3545 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node4 | 2m 35.702s | 2025-11-17 21:09:01.689 | 3546 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node4 | 2m 35.703s | 2025-11-17 21:09:01.690 | 3547 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 309 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/309 {"round":309,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/309/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node1 | 3m 15.446s | 2025-11-17 21:09:41.433 | 4525 | WARN | SOCKET_EXCEPTIONS | <<platform-core: SyncProtocolWith4 1 to 4>> | NetworkUtils: | Connection broken: 1 -> 4 | |
| com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-17T21:09:41.432663861Z at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293) at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47) at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79) at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:65) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200) at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654) at java.base/java.lang.Thread.run(Thread.java:1583) Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection or outbound has closed at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115) at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64) at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125) at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252) at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240) at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:387) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection reset at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318) at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346) at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796) at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099) at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489) at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483) at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70) at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461) at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73) at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63) at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291) at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347) at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420) at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399) at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208) at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319) at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:431) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more | |||||||||
| node3 | 3m 15.446s | 2025-11-17 21:09:41.433 | 4479 | WARN | SOCKET_EXCEPTIONS | <<platform-core: SyncProtocolWith4 3 to 4>> | NetworkUtils: | Connection broken: 3 -> 4 | |
| com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-17T21:09:41.431936060Z at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293) at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47) at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79) at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:65) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200) at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654) at java.base/java.lang.Thread.run(Thread.java:1583) Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection or outbound has closed at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115) at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64) at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125) at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252) at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240) at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:387) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection reset at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318) at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346) at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796) at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099) at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489) at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483) at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70) at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461) at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73) at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63) at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291) at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347) at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420) at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399) at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208) at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319) at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:431) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more | |||||||||
| node0 | 3m 15.447s | 2025-11-17 21:09:41.434 | 4515 | WARN | SOCKET_EXCEPTIONS | <<platform-core: SyncProtocolWith4 0 to 4>> | NetworkUtils: | Connection broken: 0 -> 4 | |
| com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-17T21:09:41.433040589Z at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293) at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47) at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79) at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:65) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200) at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654) at java.base/java.lang.Thread.run(Thread.java:1583) Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection or outbound has closed at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115) at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64) at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125) at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252) at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240) at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:387) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection reset at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318) at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346) at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796) at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099) at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489) at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483) at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70) at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461) at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73) at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63) at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291) at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347) at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420) at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399) at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208) at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319) at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:431) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more | |||||||||
| node2 | 3m 15.448s | 2025-11-17 21:09:41.435 | 4554 | WARN | SOCKET_EXCEPTIONS | <<platform-core: SyncProtocolWith4 2 to 4>> | NetworkUtils: | Connection broken: 2 -> 4 | |
| com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-17T21:09:41.431951574Z at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293) at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47) at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79) at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:65) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200) at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654) at java.base/java.lang.Thread.run(Thread.java:1583) Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection or outbound has closed at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115) at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64) at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125) at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252) at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240) at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:387) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection reset at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318) at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346) at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796) at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099) at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489) at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483) at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70) at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461) at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73) at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63) at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291) at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347) at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420) at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399) at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208) at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319) at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:431) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more | |||||||||
| node2 | 3m 35.149s | 2025-11-17 21:10:01.136 | 5108 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 441 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node1 | 3m 35.161s | 2025-11-17 21:10:01.148 | 5031 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 441 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node3 | 3m 35.169s | 2025-11-17 21:10:01.156 | 4981 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 441 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node0 | 3m 35.230s | 2025-11-17 21:10:01.217 | 5023 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 441 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node3 | 3m 35.308s | 2025-11-17 21:10:01.295 | 4984 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 441 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/441 | |
| node3 | 3m 35.309s | 2025-11-17 21:10:01.296 | 4985 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for com.swirlds.demo.consistency.ConsistencyTestingToolState@666f94bf | |
| node0 | 3m 35.362s | 2025-11-17 21:10:01.349 | 5026 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 441 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/441 | |
| node0 | 3m 35.363s | 2025-11-17 21:10:01.350 | 5027 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for com.swirlds.demo.consistency.ConsistencyTestingToolState@8aa305 | |
| node1 | 3m 35.378s | 2025-11-17 21:10:01.365 | 5034 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 441 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/441 | |
| node1 | 3m 35.379s | 2025-11-17 21:10:01.366 | 5035 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for com.swirlds.demo.consistency.ConsistencyTestingToolState@5700ef47 | |
| node3 | 3m 35.386s | 2025-11-17 21:10:01.373 | 5034 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for com.swirlds.demo.consistency.ConsistencyTestingToolState@666f94bf | |
| node3 | 3m 35.388s | 2025-11-17 21:10:01.375 | 5035 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 441 Timestamp: 2025-11-17T21:10:00.282257Z Next consensus number: 15236 Legacy running event hash: bb079ba568d68a2fb8cccb576c7d081170641a12d3925e61fa15f0cd80a6e8150dd3d5d4b56a18e069d904c3e591b19f Legacy running event mnemonic: palace-merge-bracket-riot Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -992708095 Root hash: 4bcbb388a4a091182938fefa53228ad5f56c272b1f519d7ff970162fa854052839d1dcc06ceae5e891475e22b79956aa (root) VirtualMap state / just-monitor-hire-tackle | |||||||||
| node3 | 3m 35.395s | 2025-11-17 21:10:01.382 | 5036 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+06+41.067149334Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 3m 35.396s | 2025-11-17 21:10:01.383 | 5037 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 414 File: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+06+41.067149334Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 3m 35.396s | 2025-11-17 21:10:01.383 | 5038 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node3 | 3m 35.406s | 2025-11-17 21:10:01.393 | 5039 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node3 | 3m 35.407s | 2025-11-17 21:10:01.394 | 5040 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 441 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/441 {"round":441,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/441/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node0 | 3m 35.447s | 2025-11-17 21:10:01.434 | 5066 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for com.swirlds.demo.consistency.ConsistencyTestingToolState@8aa305 | |
| node0 | 3m 35.448s | 2025-11-17 21:10:01.435 | 5067 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 441 Timestamp: 2025-11-17T21:10:00.282257Z Next consensus number: 15236 Legacy running event hash: bb079ba568d68a2fb8cccb576c7d081170641a12d3925e61fa15f0cd80a6e8150dd3d5d4b56a18e069d904c3e591b19f Legacy running event mnemonic: palace-merge-bracket-riot Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -992708095 Root hash: 4bcbb388a4a091182938fefa53228ad5f56c272b1f519d7ff970162fa854052839d1dcc06ceae5e891475e22b79956aa (root) VirtualMap state / just-monitor-hire-tackle | |||||||||
| node1 | 3m 35.454s | 2025-11-17 21:10:01.441 | 5074 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for com.swirlds.demo.consistency.ConsistencyTestingToolState@5700ef47 | |
| node0 | 3m 35.455s | 2025-11-17 21:10:01.442 | 5068 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+06+40.946965742Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 3m 35.455s | 2025-11-17 21:10:01.442 | 5069 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 414 File: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+06+40.946965742Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 3m 35.456s | 2025-11-17 21:10:01.443 | 5070 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node1 | 3m 35.456s | 2025-11-17 21:10:01.443 | 5075 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 441 Timestamp: 2025-11-17T21:10:00.282257Z Next consensus number: 15236 Legacy running event hash: bb079ba568d68a2fb8cccb576c7d081170641a12d3925e61fa15f0cd80a6e8150dd3d5d4b56a18e069d904c3e591b19f Legacy running event mnemonic: palace-merge-bracket-riot Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -992708095 Root hash: 4bcbb388a4a091182938fefa53228ad5f56c272b1f519d7ff970162fa854052839d1dcc06ceae5e891475e22b79956aa (root) VirtualMap state / just-monitor-hire-tackle | |||||||||
| node1 | 3m 35.462s | 2025-11-17 21:10:01.449 | 5076 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+06+41.121642458Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node1 | 3m 35.462s | 2025-11-17 21:10:01.449 | 5077 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 414 File: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+06+41.121642458Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node1 | 3m 35.462s | 2025-11-17 21:10:01.449 | 5078 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node0 | 3m 35.466s | 2025-11-17 21:10:01.453 | 5071 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node0 | 3m 35.466s | 2025-11-17 21:10:01.453 | 5072 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 441 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/441 {"round":441,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/441/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node1 | 3m 35.473s | 2025-11-17 21:10:01.460 | 5079 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node1 | 3m 35.473s | 2025-11-17 21:10:01.460 | 5080 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 441 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/441 {"round":441,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/441/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node2 | 3m 35.518s | 2025-11-17 21:10:01.505 | 5121 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 441 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/441 | |
| node2 | 3m 35.519s | 2025-11-17 21:10:01.506 | 5122 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for com.swirlds.demo.consistency.ConsistencyTestingToolState@f2c0628 | |
| node2 | 3m 35.601s | 2025-11-17 21:10:01.588 | 5156 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for com.swirlds.demo.consistency.ConsistencyTestingToolState@f2c0628 | |
| node2 | 3m 35.603s | 2025-11-17 21:10:01.590 | 5157 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 441 Timestamp: 2025-11-17T21:10:00.282257Z Next consensus number: 15236 Legacy running event hash: bb079ba568d68a2fb8cccb576c7d081170641a12d3925e61fa15f0cd80a6e8150dd3d5d4b56a18e069d904c3e591b19f Legacy running event mnemonic: palace-merge-bracket-riot Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -992708095 Root hash: 4bcbb388a4a091182938fefa53228ad5f56c272b1f519d7ff970162fa854052839d1dcc06ceae5e891475e22b79956aa (root) VirtualMap state / just-monitor-hire-tackle | |||||||||
| node2 | 3m 35.610s | 2025-11-17 21:10:01.597 | 5158 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+06+41.228512734Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 3m 35.610s | 2025-11-17 21:10:01.597 | 5159 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 414 File: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+06+41.228512734Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 3m 35.610s | 2025-11-17 21:10:01.597 | 5160 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node2 | 3m 35.621s | 2025-11-17 21:10:01.608 | 5161 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node2 | 3m 35.621s | 2025-11-17 21:10:01.608 | 5162 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 441 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/441 {"round":441,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/441/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node1 | 4m 34.901s | 2025-11-17 21:11:00.888 | 6601 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 579 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node0 | 4m 34.919s | 2025-11-17 21:11:00.906 | 6587 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 579 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node3 | 4m 34.959s | 2025-11-17 21:11:00.946 | 6553 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 579 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node2 | 4m 34.989s | 2025-11-17 21:11:00.976 | 6818 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 579 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node2 | 4m 35.089s | 2025-11-17 21:11:01.076 | 6821 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 579 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/579 | |
| node2 | 4m 35.090s | 2025-11-17 21:11:01.077 | 6822 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for com.swirlds.demo.consistency.ConsistencyTestingToolState@dc48cd5 | |
| node0 | 4m 35.134s | 2025-11-17 21:11:01.121 | 6590 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 579 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/579 | |
| node0 | 4m 35.135s | 2025-11-17 21:11:01.122 | 6591 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for com.swirlds.demo.consistency.ConsistencyTestingToolState@416ee21 | |
| node3 | 4m 35.147s | 2025-11-17 21:11:01.134 | 6556 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 579 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/579 | |
| node3 | 4m 35.147s | 2025-11-17 21:11:01.134 | 6557 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for com.swirlds.demo.consistency.ConsistencyTestingToolState@375974fc | |
| node1 | 4m 35.159s | 2025-11-17 21:11:01.146 | 6604 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 579 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/579 | |
| node1 | 4m 35.160s | 2025-11-17 21:11:01.147 | 6605 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for com.swirlds.demo.consistency.ConsistencyTestingToolState@5611c0a5 | |
| node2 | 4m 35.170s | 2025-11-17 21:11:01.157 | 6853 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for com.swirlds.demo.consistency.ConsistencyTestingToolState@dc48cd5 | |
| node2 | 4m 35.172s | 2025-11-17 21:11:01.159 | 6854 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 579 Timestamp: 2025-11-17T21:11:00.040636Z Next consensus number: 18512 Legacy running event hash: 6f6deb081ef34e092836ce9ee4f0258453f69163a0b557e1c04d0b1a91a1635af10d094d161b8ab22a8c7cee6c587e90 Legacy running event mnemonic: unit-pair-taxi-scorpion Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -336406190 Root hash: 9a87ef6670028620567eed8c77a7517e01bf4dfbadd44637faa41ba403453aca1b1e3751d6bfc2e62aeb702539a582d5 (root) VirtualMap state / quality-battle-able-despair | |||||||||
| node2 | 4m 35.179s | 2025-11-17 21:11:01.166 | 6855 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+10+27.225461213Z_seq1_minr474_maxr5474_orgn0.pces Last file: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+06+41.228512734Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 4m 35.179s | 2025-11-17 21:11:01.166 | 6856 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 552 File: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+10+27.225461213Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node2 | 4m 35.179s | 2025-11-17 21:11:01.166 | 6857 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node2 | 4m 35.181s | 2025-11-17 21:11:01.168 | 6858 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node2 | 4m 35.181s | 2025-11-17 21:11:01.168 | 6859 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 579 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/579 {"round":579,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/579/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node2 | 4m 35.183s | 2025-11-17 21:11:01.170 | 6860 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1 | |
| node0 | 4m 35.217s | 2025-11-17 21:11:01.204 | 6622 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for com.swirlds.demo.consistency.ConsistencyTestingToolState@416ee21 | |
| node0 | 4m 35.219s | 2025-11-17 21:11:01.206 | 6623 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 579 Timestamp: 2025-11-17T21:11:00.040636Z Next consensus number: 18512 Legacy running event hash: 6f6deb081ef34e092836ce9ee4f0258453f69163a0b557e1c04d0b1a91a1635af10d094d161b8ab22a8c7cee6c587e90 Legacy running event mnemonic: unit-pair-taxi-scorpion Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -336406190 Root hash: 9a87ef6670028620567eed8c77a7517e01bf4dfbadd44637faa41ba403453aca1b1e3751d6bfc2e62aeb702539a582d5 (root) VirtualMap state / quality-battle-able-despair | |||||||||
| node3 | 4m 35.222s | 2025-11-17 21:11:01.209 | 6588 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for com.swirlds.demo.consistency.ConsistencyTestingToolState@375974fc | |
| node3 | 4m 35.224s | 2025-11-17 21:11:01.211 | 6589 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 579 Timestamp: 2025-11-17T21:11:00.040636Z Next consensus number: 18512 Legacy running event hash: 6f6deb081ef34e092836ce9ee4f0258453f69163a0b557e1c04d0b1a91a1635af10d094d161b8ab22a8c7cee6c587e90 Legacy running event mnemonic: unit-pair-taxi-scorpion Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -336406190 Root hash: 9a87ef6670028620567eed8c77a7517e01bf4dfbadd44637faa41ba403453aca1b1e3751d6bfc2e62aeb702539a582d5 (root) VirtualMap state / quality-battle-able-despair | |||||||||
| node0 | 4m 35.225s | 2025-11-17 21:11:01.212 | 6624 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+10+27.186566834Z_seq1_minr474_maxr5474_orgn0.pces Last file: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+06+40.946965742Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 4m 35.226s | 2025-11-17 21:11:01.213 | 6626 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 552 File: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+10+27.186566834Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node0 | 4m 35.226s | 2025-11-17 21:11:01.213 | 6634 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node0 | 4m 35.227s | 2025-11-17 21:11:01.214 | 6635 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node0 | 4m 35.228s | 2025-11-17 21:11:01.215 | 6636 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 579 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/579 {"round":579,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/579/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node0 | 4m 35.229s | 2025-11-17 21:11:01.216 | 6637 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1 | |
| node3 | 4m 35.230s | 2025-11-17 21:11:01.217 | 6590 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+10+27.285649301Z_seq1_minr474_maxr5474_orgn0.pces Last file: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+06+41.067149334Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 4m 35.230s | 2025-11-17 21:11:01.217 | 6591 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 552 File: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+10+27.285649301Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node3 | 4m 35.230s | 2025-11-17 21:11:01.217 | 6592 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node3 | 4m 35.232s | 2025-11-17 21:11:01.219 | 6593 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node3 | 4m 35.232s | 2025-11-17 21:11:01.219 | 6594 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 579 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/579 {"round":579,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/579/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node3 | 4m 35.234s | 2025-11-17 21:11:01.221 | 6595 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1 | |
| node1 | 4m 35.241s | 2025-11-17 21:11:01.228 | 6644 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for com.swirlds.demo.consistency.ConsistencyTestingToolState@5611c0a5 | |
| node1 | 4m 35.243s | 2025-11-17 21:11:01.230 | 6645 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 579 Timestamp: 2025-11-17T21:11:00.040636Z Next consensus number: 18512 Legacy running event hash: 6f6deb081ef34e092836ce9ee4f0258453f69163a0b557e1c04d0b1a91a1635af10d094d161b8ab22a8c7cee6c587e90 Legacy running event mnemonic: unit-pair-taxi-scorpion Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -336406190 Root hash: 9a87ef6670028620567eed8c77a7517e01bf4dfbadd44637faa41ba403453aca1b1e3751d6bfc2e62aeb702539a582d5 (root) VirtualMap state / quality-battle-able-despair | |||||||||
| node1 | 4m 35.250s | 2025-11-17 21:11:01.237 | 6646 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+06+41.121642458Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+10+27.157584500Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node1 | 4m 35.250s | 2025-11-17 21:11:01.237 | 6647 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 552 File: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+10+27.157584500Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node1 | 4m 35.250s | 2025-11-17 21:11:01.237 | 6648 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node1 | 4m 35.252s | 2025-11-17 21:11:01.239 | 6649 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node1 | 4m 35.252s | 2025-11-17 21:11:01.239 | 6650 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 579 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/579 {"round":579,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/579/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node1 | 4m 35.253s | 2025-11-17 21:11:01.240 | 6651 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1 | |
| node3 | 5m 34.999s | 2025-11-17 21:12:00.986 | 8139 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 718 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node1 | 5m 35.031s | 2025-11-17 21:12:01.018 | 8227 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 718 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node0 | 5m 35.070s | 2025-11-17 21:12:01.057 | 8167 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 718 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node2 | 5m 35.161s | 2025-11-17 21:12:01.148 | 8402 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 718 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node2 | 5m 35.229s | 2025-11-17 21:12:01.216 | 8405 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 718 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/718 | |
| node2 | 5m 35.230s | 2025-11-17 21:12:01.217 | 8406 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for com.swirlds.demo.consistency.ConsistencyTestingToolState@165a47b3 | |
| node1 | 5m 35.283s | 2025-11-17 21:12:01.270 | 8230 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 718 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/718 | |
| node1 | 5m 35.284s | 2025-11-17 21:12:01.271 | 8231 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for com.swirlds.demo.consistency.ConsistencyTestingToolState@69661168 | |
| node2 | 5m 35.310s | 2025-11-17 21:12:01.297 | 8445 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for com.swirlds.demo.consistency.ConsistencyTestingToolState@165a47b3 | |
| node3 | 5m 35.311s | 2025-11-17 21:12:01.298 | 8142 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 718 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/718 | |
| node3 | 5m 35.311s | 2025-11-17 21:12:01.298 | 8143 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for com.swirlds.demo.consistency.ConsistencyTestingToolState@64bd9c28 | |
| node2 | 5m 35.312s | 2025-11-17 21:12:01.299 | 8446 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 718 Timestamp: 2025-11-17T21:12:00.140323102Z Next consensus number: 21833 Legacy running event hash: bcb9f215de3147946c027851fa8d5e037dccca82d5ddf2932489742cd6b3d9a37e187e401cc6d3c260fde0c2104ac020 Legacy running event mnemonic: happy-runway-delay-verb Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 369795596 Root hash: 0e535a3af5c68dad5b2081956f2e3fe48910f0ee080d3b639cb7448b28ebc8a5fc353d3c1a7b12b5e666afb4637e7915 (root) VirtualMap state / skill-elder-elder-engine | |||||||||
| node2 | 5m 35.320s | 2025-11-17 21:12:01.307 | 8447 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+10+27.225461213Z_seq1_minr474_maxr5474_orgn0.pces Last file: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+06+41.228512734Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 5m 35.320s | 2025-11-17 21:12:01.307 | 8448 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 691 File: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+10+27.225461213Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node2 | 5m 35.321s | 2025-11-17 21:12:01.308 | 8449 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node2 | 5m 35.325s | 2025-11-17 21:12:01.312 | 8450 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node2 | 5m 35.325s | 2025-11-17 21:12:01.312 | 8451 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 718 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/718 {"round":718,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/718/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node2 | 5m 35.327s | 2025-11-17 21:12:01.314 | 8452 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/38 | |
| node0 | 5m 35.344s | 2025-11-17 21:12:01.331 | 8170 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 718 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/718 | |
| node0 | 5m 35.344s | 2025-11-17 21:12:01.331 | 8171 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for com.swirlds.demo.consistency.ConsistencyTestingToolState@165a47b3 | |
| node1 | 5m 35.366s | 2025-11-17 21:12:01.353 | 8262 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for com.swirlds.demo.consistency.ConsistencyTestingToolState@69661168 | |
| node1 | 5m 35.368s | 2025-11-17 21:12:01.355 | 8263 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 718 Timestamp: 2025-11-17T21:12:00.140323102Z Next consensus number: 21833 Legacy running event hash: bcb9f215de3147946c027851fa8d5e037dccca82d5ddf2932489742cd6b3d9a37e187e401cc6d3c260fde0c2104ac020 Legacy running event mnemonic: happy-runway-delay-verb Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 369795596 Root hash: 0e535a3af5c68dad5b2081956f2e3fe48910f0ee080d3b639cb7448b28ebc8a5fc353d3c1a7b12b5e666afb4637e7915 (root) VirtualMap state / skill-elder-elder-engine | |||||||||
| node1 | 5m 35.375s | 2025-11-17 21:12:01.362 | 8264 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+06+41.121642458Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+10+27.157584500Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node1 | 5m 35.375s | 2025-11-17 21:12:01.362 | 8265 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 691 File: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+10+27.157584500Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node1 | 5m 35.375s | 2025-11-17 21:12:01.362 | 8266 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node1 | 5m 35.379s | 2025-11-17 21:12:01.366 | 8267 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node1 | 5m 35.380s | 2025-11-17 21:12:01.367 | 8268 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 718 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/718 {"round":718,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/718/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node1 | 5m 35.381s | 2025-11-17 21:12:01.368 | 8269 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/38 | |
| node3 | 5m 35.392s | 2025-11-17 21:12:01.379 | 8182 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for com.swirlds.demo.consistency.ConsistencyTestingToolState@64bd9c28 | |
| node3 | 5m 35.394s | 2025-11-17 21:12:01.381 | 8183 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 718 Timestamp: 2025-11-17T21:12:00.140323102Z Next consensus number: 21833 Legacy running event hash: bcb9f215de3147946c027851fa8d5e037dccca82d5ddf2932489742cd6b3d9a37e187e401cc6d3c260fde0c2104ac020 Legacy running event mnemonic: happy-runway-delay-verb Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 369795596 Root hash: 0e535a3af5c68dad5b2081956f2e3fe48910f0ee080d3b639cb7448b28ebc8a5fc353d3c1a7b12b5e666afb4637e7915 (root) VirtualMap state / skill-elder-elder-engine | |||||||||
| node3 | 5m 35.400s | 2025-11-17 21:12:01.387 | 8184 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+10+27.285649301Z_seq1_minr474_maxr5474_orgn0.pces Last file: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+06+41.067149334Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 5m 35.401s | 2025-11-17 21:12:01.388 | 8185 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 691 File: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+10+27.285649301Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node3 | 5m 35.401s | 2025-11-17 21:12:01.388 | 8186 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node3 | 5m 35.405s | 2025-11-17 21:12:01.392 | 8187 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node3 | 5m 35.405s | 2025-11-17 21:12:01.392 | 8188 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 718 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/718 {"round":718,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/718/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node3 | 5m 35.406s | 2025-11-17 21:12:01.393 | 8189 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/38 | |
| node0 | 5m 35.420s | 2025-11-17 21:12:01.407 | 8202 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for com.swirlds.demo.consistency.ConsistencyTestingToolState@165a47b3 | |
| node0 | 5m 35.424s | 2025-11-17 21:12:01.411 | 8203 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 718 Timestamp: 2025-11-17T21:12:00.140323102Z Next consensus number: 21833 Legacy running event hash: bcb9f215de3147946c027851fa8d5e037dccca82d5ddf2932489742cd6b3d9a37e187e401cc6d3c260fde0c2104ac020 Legacy running event mnemonic: happy-runway-delay-verb Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 369795596 Root hash: 0e535a3af5c68dad5b2081956f2e3fe48910f0ee080d3b639cb7448b28ebc8a5fc353d3c1a7b12b5e666afb4637e7915 (root) VirtualMap state / skill-elder-elder-engine | |||||||||
| node0 | 5m 35.429s | 2025-11-17 21:12:01.416 | 8212 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+10+27.186566834Z_seq1_minr474_maxr5474_orgn0.pces Last file: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+06+40.946965742Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 5m 35.429s | 2025-11-17 21:12:01.416 | 8213 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 691 File: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+10+27.186566834Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node0 | 5m 35.429s | 2025-11-17 21:12:01.416 | 8214 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node0 | 5m 35.433s | 2025-11-17 21:12:01.420 | 8215 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node0 | 5m 35.434s | 2025-11-17 21:12:01.421 | 8216 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 718 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/718 {"round":718,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/718/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node0 | 5m 35.435s | 2025-11-17 21:12:01.422 | 8217 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/38 | |
| node4 | 5m 57.419s | 2025-11-17 21:12:23.406 | 1 | INFO | STARTUP | <main> | StaticPlatformBuilder: | ||
| ////////////////////// // Node is Starting // ////////////////////// | |||||||||
| node4 | 5m 57.514s | 2025-11-17 21:12:23.501 | 2 | DEBUG | STARTUP | <main> | StaticPlatformBuilder: | main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload] | |
| node4 | 5m 57.530s | 2025-11-17 21:12:23.517 | 3 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node4 | 5m 57.645s | 2025-11-17 21:12:23.632 | 4 | INFO | STARTUP | <main> | Browser: | The following nodes [4] are set to run locally | |
| node4 | 5m 57.676s | 2025-11-17 21:12:23.663 | 5 | DEBUG | STARTUP | <main> | BootstrapUtils: | Scanning the classpath for RuntimeConstructable classes | |
| node4 | 5m 59.198s | 2025-11-17 21:12:25.185 | 6 | DEBUG | STARTUP | <main> | BootstrapUtils: | Done with registerConstructables, time taken 1522ms | |
| node4 | 5m 59.207s | 2025-11-17 21:12:25.194 | 7 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | constructor called in Main. | |
| node4 | 5m 59.210s | 2025-11-17 21:12:25.197 | 8 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node4 | 5m 59.247s | 2025-11-17 21:12:25.234 | 9 | INFO | STARTUP | <main> | PrometheusEndpoint: | PrometheusEndpoint: Starting server listing on port: 9999 | |
| node4 | 5m 59.313s | 2025-11-17 21:12:25.300 | 10 | WARN | STARTUP | <main> | CryptoStatic: | There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB. | |
| node4 | 5m 59.314s | 2025-11-17 21:12:25.301 | 11 | DEBUG | STARTUP | <main> | CryptoStatic: | Started generating keys | |
| node4 | 6.003m | 2025-11-17 21:12:26.150 | 12 | DEBUG | STARTUP | <main> | CryptoStatic: | Done generating keys | |
| node4 | 6.004m | 2025-11-17 21:12:26.244 | 15 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node4 | 6.004m | 2025-11-17 21:12:26.252 | 16 | INFO | STARTUP | <main> | StartupStateUtils: | The following saved states were found on disk: | |
| - /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/309 - /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/174 - /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/38 - /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1 | |||||||||
| node4 | 6.004m | 2025-11-17 21:12:26.252 | 17 | INFO | STARTUP | <main> | StartupStateUtils: | Loading latest state from disk. | |
| node4 | 6.004m | 2025-11-17 21:12:26.253 | 18 | INFO | STARTUP | <main> | StartupStateUtils: | Loading signed state from disk: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/309 | |
| node4 | 6.005m | 2025-11-17 21:12:26.261 | 19 | INFO | STATE_TO_DISK | <main> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp | |
| node4 | 6.007m | 2025-11-17 21:12:26.385 | 29 | INFO | STARTUP | <main> | ConsistencyTestingToolState: | New State Constructed. | |
| node4 | 6m 1.253s | 2025-11-17 21:12:27.240 | 31 | INFO | STARTUP | <main> | StartupStateUtils: | Loaded state's hash is the same as when it was saved. | |
| node4 | 6m 1.259s | 2025-11-17 21:12:27.246 | 32 | INFO | STARTUP | <main> | StartupStateUtils: | Platform has loaded a saved state {"round":309,"consensusTimestamp":"2025-11-17T21:09:00.190305213Z"} [com.swirlds.logging.legacy.payload.SavedStateLoadedPayload] | |
| node4 | 6m 1.265s | 2025-11-17 21:12:27.252 | 35 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node4 | 6m 1.267s | 2025-11-17 21:12:27.254 | 38 | INFO | STARTUP | <main> | BootstrapUtils: | Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]. | |
| node4 | 6m 1.272s | 2025-11-17 21:12:27.259 | 39 | INFO | STARTUP | <main> | AddressBookInitializer: | Using the loaded state's address book and weight values. | |
| node4 | 6m 1.284s | 2025-11-17 21:12:27.271 | 40 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node4 | 6m 1.287s | 2025-11-17 21:12:27.274 | 41 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node4 | 6m 2.391s | 2025-11-17 21:12:28.378 | 42 | INFO | STARTUP | <main> | OSHealthChecker: | ||
| PASSED - Clock Source Speed Check Report[callsPerSec=26159879] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=203060, randomLong=5399928805053717533, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=12290, randomLong=-6562945365866171511, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=1308720, data=35, exception=null] OS Health Check Report - Complete (took 1025 ms) | |||||||||
| node4 | 6m 2.425s | 2025-11-17 21:12:28.412 | 43 | DEBUG | STARTUP | <main> | BootstrapUtils: | jvmPauseDetectorThread started | |
| node4 | 6m 2.561s | 2025-11-17 21:12:28.548 | 44 | INFO | STARTUP | <main> | PcesUtilities: | Span compaction completed for data/saved/preconsensus-events/4/2025/11/17/2025-11-17T21+06+41.140869703Z_seq0_minr1_maxr501_orgn0.pces, new upper bound is 395 | |
| node4 | 6m 2.565s | 2025-11-17 21:12:28.552 | 45 | INFO | STARTUP | <main> | StandardScratchpad: | Scratchpad platform.iss contents: | |
| LAST_ISS_ROUND null | |||||||||
| node4 | 6m 2.567s | 2025-11-17 21:12:28.554 | 46 | INFO | STARTUP | <main> | PlatformBuilder: | Default platform pool parallelism: 8 | |
| node4 | 6m 2.661s | 2025-11-17 21:12:28.648 | 47 | INFO | STARTUP | <main> | SwirldsPlatform: | Starting with roster history: | |
| RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIdUmpLKzyXgUwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBALXCoDQ+HOVsEDTZpFuJITSaGwaKX2is5K1P/lV+G+ll6u36IdqKNnZIirJrpX2N0Ad6NeF/oFcMhietrKt818PDA9Tbb2tqcHNKTxxZAEj7amQTsrU4EsNmUhaPgMs89yj9WLxCXVzW05cQjqYEA/hymzohWs1BdU3Y2KdmELe0v5fzRgDpNgYHhUN7IrlrlgXEWpuKRskBYc4PIvyACijY0/zkeEAyHOshYYGKhQbNm/NGWhFq83ro77CZZhX3Vl7hRnHLaEoCEE8atY8R1Txhy8aObhiS6R8ZVRTkZLar/FG/xe78RQfwHHD1al2w5oHR7xgTZylhbD+nVQ09Zmi25USpvqwumbMBE0OWhV+VH1WLCHfLQs6/5yuDjeZ/0D9tpQ8pfkiEkGLedzUzQkq+4/HmN4IFTOhgJHlu1tVUqohZIPZ5zSzqkqFzFQGRo2uAX8C2EJ3qgQMAEOpH8iOjiSKsezlIPuwvmrVDPxVfpY2Cq60oxRu6B8bZdbQkfwIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQAloxwiVu7pBhkO4fLqYRw4FC0VEx+c47W4xnrq3G/uXMGwE2Mfwple9FZnfT9JgSoT1UVw+cigo4720WdrPqkK8qnA3/PzGXlfJ3k6eFcBuli/KY1TakIJUAxFt5biNKatheMwAKsbF/JyVyaqG2dbSaXQ6hZBLQTYmLrmFWMvi9QdM1S8vNVMjn0hE2qQJtnVRuVwqRaAQ225jDv2CUCT28t0EWE6ccbiRi74l8KoW1Lo3v2EQ6ZZ89Xt3CwFSQHa6YVT685ECy82qMysU+YHBe9WmwJW05UAAY7JRsOo+RuuU/r4acNLmzprG+l7qsqqPkwXTcziw9Y2OYsFgY4bTlIOV0JC0AYApctDB3gbn83LM73CWccGrXq0liSV0wL11wscH3gFohXrwb646+6hgncZiDshlZlWaFSkHQJAxTR9bsbsCwKdZpzIIVOVTOT/3oLQKCCQvPriTpJiNa0P6gB0pq64lNcyG9fL8vS3YFFnWJTZwb8ZzGK+LZ91/2Y=", "gossipEndpoint": [{ "ipAddressV4": "IjgVaw==", "port": 30124 }, { "ipAddressV4": "CoAAGA==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAJguXwyGFpb8MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTIwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTIwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDYXoYHBtw8adD5sxLZSnlG9XgBLWVbIDl3YA4rZZ11cgl6FG2TvF8UVNXQ177cRm1xUUJRI5ulSgDofnm7Iuf6c/GoQrud2nP1yMWewGslwiEi1h2pxbN7doFvn/92Y0lJVwSV/vOpbIyPRoMeF0jXd7TEI7dYj4S7gV9uWmQCIWjwTZqVsjIAtzEkYnmS0/m5XuD9MJsin8OQRu/PEFL8qaVPQJ2GhOhpUJqvADQ/Lsq/FHcPjylcRcnUQlFRojk2jqugtoRegByjPrAOSYGJeWUCVYmd7W51L/AkVx1rDLeHj0zLTTzQRF5G56i+S+tAcpY/uiCrwLvszFlDlD1diOuaucmu54lalrSTlVe5eOyq2ga2tKi11LQ+w09105zLyRWk7DBU93f5dTYNSmokI7b4sVRxu6SP0p/F9wND77wv2Ax5OpIWWty8zy8Y+xOuRyFu/rJ4ddDmRYvRmptM0rCAfv6hgd3m5Y/OAadQm/OuN91Uq9PIJdlMtjDbIfECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEANutmL3V1PlvlsZ6xG8Sx9cKTok3kf3rBf7D7eE8Nn8ryHi3cw9CvCaj1E6zmTTh9k23DAZVWulhjTY5GWcx5NO7QAWjKau44g/HecNNrWsD/+nIrhmAk2WxKp175CwqJaIWA7CM6VMfFktjaflUPcB6RJnHrAa8M1HUpEsBz0mFmLz7lIaDemxYCE8M8slb6wTMjpL83GB+ejudRe7YK2ZWixM+CGp0ARkV+EecHaCXgEoROUNwP6mZVJcgSVR1QBQwcGAMIrutsKENM8HR9o3LWacigoJXf+IX8c6aJhrHfFvm62q+hi3baj7iR6gebEdWPtmEXgoVWOk230fLGyPU1oBxaDdYa8V4+ZFv03O91By9tuFrwZOcLCb4CPRyr8A47lHNjRIeo2nUF/c+SjV0eBcPKCnn1nW/AQWCxJ0QzzG6tEeMAGdDrE2ujPlB+Y9Sn8vB0zjYQHTr1NKyyXNogB4y48jofLDLDGOQYI6uP2fDgZeiq4dV8w91WbPHV", "gossipEndpoint": [{ "ipAddressV4": "I8DbAg==", "port": 30125 }, { "ipAddressV4": "CoAAGQ==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJwswl59m488MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDAqlNMpfduuW0ETQVjdKf5ZBe3Ug/ybRMoCWIlue8UoxFzamAtoeFEW3GVi862iImRVyHbkBZzDQUw4ABwMdxfzTL9voozkMaOZb4KQ9yZ9zNLAAmSSuE6RFmSJnBtfufxFXqiu6esbcvyropjZLc65F2uoMCpKN0CHFpWEb2GZAaipp7WCOon0NllDLqkjPylluXO4mjbzzMSDPbBWRD8VjjkxZeszWSXYxz9hqcRYX01CGg+jhooCQ6j2yB8sfFAffIeTG6GSV1uCFa4san2emhQWpr+cHaVYJMtejL43HaEVQnF3vh5Z10T/7co63C63aay2hs6Bx5SschosyYiafI7GtbQ4qpOgjEDFT1jlydK21gy6MV3SFEYwcUfxvxxRj6pS7xiMFn4FYnBKPJWkaDkwTqboEshxstvASQOW993uEwzh4EjctRHSjSuTU6S9OsWi5I5cRF+xK6GaWsTp0KyO8uVpuM9kZfpOcor294quyKJ9nylNyIt/m8Q8/ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAqPLB/xr0Yv1l9w/RO+bqFtl8TkxF/6jOqoEUXY06dEInopLYpmkksZZ9G8vebt6hAoLjaxNMdRqCkzKgy4jn7/SQZNV9FMbZ7ckiDxsBxYZ2ZaBootuWzzVD6hCSO3Tg6JgkIzldtFtNcDVBRgZnHg+Rl6hn+gFV5S2OTTTPHWK7GHwgHXLhK7N0RL4YVrRCi/HTUZnuYCjBwvdDte5iqytY05cAO4p72P6YtDaOdAfL/IIKd1ylCWITDqTp/JDBz1uxjQmsXLVD/KEEtlvYlGjIr+wUUqIUPhFvB6ajl2NO0D/r+t1BH454zbodU92QnOJpXpoNuOv7jjALHCqo70mCSwTNUSZuVP6/KLmQe8sSzYs7O/c25FzHKBYy+aZujoa/X7aI6XVmsUkj6ae9MSvQurk0jMNg/Jy5EtWOMy7WEuyadrAv6KSP3oIfmL9jWoPcyOMfvjRHxGqOfZuFZatAwswY6O0E3ATTrN03t/BVqNHIYIXc6UOiUTo2Nx56", "gossipEndpoint": [{ "ipAddressV4": "IjlJoA==", "port": 30126 }, { "ipAddressV4": "CoAAHA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAOxH0o7YkAUoMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDf6+SJl+puqRNd5r2Tb802jQTqPm7k3NXIeU8NQ3Hy9p0G+9p4Hgnt3ftipar7lKPKnp4PFrOP7E7XSKpafxK2OVQ0jTMvc6Yjqt+9mzyNSI1I8cSHTmhJ7kMBt0+NwVM8QN+fbKcbQaoNiPwMcckVtGeMad4aZM6hRyxzI0H3wgMj4JiM9VRwx7JbEo3R7akRwLwGr9ZQm2EQwqiyReNkBnXrsyP4KPPVAoeMfGchoAuBbV+r6v1OeYddocYmZkrsvMXUKF/uEcgd8gTu+pv3jObwIEVqXo1yC6ZlCFqO7LIvT8jTAAljkszoo67ykXTbKS0PZeLDg6nvdPvBMQ50yjfswR88S6N8VU6pud7Y+VbMYUiGzlrFi4MB9dikAjEj4PEetQyZdn84ZXGxerXlU/vTO2Fp4i1ec5rmX1P0WYMlbNELE408j5nfCfzD/qdcF5HZAiUVTYU/SWpzWcn34++KGpuqZZQdsGwCLQWeMeA/OEemYChis4cO94aOzrECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAlj5YIsbYXk2JGP9kRCBLDgz27ymYi1KDbO8g18V4T0zj2Zl7858U7mF9UBSSW+Cjl1UtUdvqFWZhh8jRoO3Jov1QGTULHRfyyPElD4VpwFribiu4GYJaodYy6NE50WwSJf32gLG0jHQWt7q+cOrn6WaG2h8O1sIxbTlnu1kqKQUQtu4oX8u23b5m9QXVJfJVdecwD5Rmab2d3dq/NNv2iNELH0myqtcoqw26xwIvXwaS4Gqi+Y0cOfjWL5Gv5AHIwvBXGIh3KUU7pbyBzqjkigbzSeoZw0C8G2cRTl0+QTuet2SVYlFh5J9/FBLvIfMfIpguglaU6xTVoRpo7RF24qQKFt2IlBROpqcwl0FyfE+2c19FGt1V8E5dYqE4T2mHT6FSOI3DckA2afBm1OCeMNtkqCQT8x+JvdKrgUh44QDm4PIVZDzaxog/zOzRWPCgpCPq0HcNMzgCVFt+4q8eTL9Ju/rQcS9bDosjMA69NGLIOCdPW2i/gkS9x9rTXgyp", "gossipEndpoint": [{ "ipAddressV4": "aJuKEQ==", "port": 30127 }, { "ipAddressV4": "CoAAGg==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIXlngkVEv6iMwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAL4o3FK8th1cG+FSlw4iT9FlkwK+hOj4Ay6Z70mZlsNwszgxvddUEO4BEdA1iSWfxkYOLl4QwwPr3l394a07VfB5OK3dqJ6CjVdByyvzghtk3gOpkskWlJxp6vah7BbIJFWE8off7fhCdwAGSrwIRdGE8u8GbKJIdHk6/XyjB3j0BXTIgeaPTJxLeuz/2l/dQVRMXyZNxlc5UVQYnX9haMRk7M5bkb9uwfYPRikEJFp6G72x7M7Q9lBGJ3ArCQn/lPJfHSg01GxfDhWH8DOwLaFdv1bCs2zHTn7R7Wq9ymXvkUsZhlYO4mLR8HKDcM3sCrJa2rg8vgnIoZupHABKxkgtT2wxV7fM5f2oiz0mDYDTRJpgmK1lmNANj2tKnGqeDnsW7Q3zwufgZZhbks8+8uigyOyKNbp6D7Vv5KeYRibjr/xh+yWT0v02dtpBIdhqDa5CUVD9fCwigZj3PQc8N4e47ZL6s1pXpQ6Cf0lB0fSsvyhnGRa8HMx2q5eg5j/lCQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQCr9yUzOoi0xhoDE1mqR3FR/iVCq9PaBUURWL743LDMrlEvpzKX0upcwwwdgJFjVqVUywh6rKeHQt4O4UV6FIbpp0PSjSE7XZSK3UNqnhZJhQ3aNrOP+6wBhm2B0ZjrxyMS1EWeD9tcNkdYluO00RlieAEV4zwoAfeFPSB21iXW5dU8idhNuTLptDc7SJoErxN+44jvcrSe/ZhpQohG6WfyDPH0BE1tyzsiD29PAWKkrfhg5kzjTAP/qFp+ByazeltP9/F0NXI5AHbE0pKYr56XUlwDfDZOTU9b1YeS7kKyPvccvC2j9NjGGM7NjafdFLHUTYBZiNUTZXVstddYtTCVbTqI7I/x6hoeeNVDZv7XluwZLrYsDNsNrWU3c9VijPK1CE5Owy+gJoGgxEHfA/n9Jvc3lEesqKBpW92RazkpHW2eD9wh8Ayv3q6PNDGzWyiXA8YWW6yD/dIp2Oh8szZUfOXy8sQ8VW86T6RsqGP5CKKPGW1NnP/KTKe5/WoBLZQ=", "gossipEndpoint": [{ "ipAddressV4": "IhuUGg==", "port": 30128 }, { "ipAddressV4": "CoAAGw==", "port": 30128 }] }] } | |||||||||
| node4 | 6m 2.687s | 2025-11-17 21:12:28.674 | 48 | INFO | STARTUP | <main> | ConsistencyTestingToolState: | State initialized with state long 6551453575234285285. | |
| node4 | 6m 2.688s | 2025-11-17 21:12:28.675 | 49 | INFO | STARTUP | <main> | ConsistencyTestingToolState: | State initialized with 309 rounds handled. | |
| node4 | 6m 2.689s | 2025-11-17 21:12:28.676 | 50 | INFO | STARTUP | <main> | TransactionHandlingHistory: | Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv | |
| node4 | 6m 2.689s | 2025-11-17 21:12:28.676 | 51 | INFO | STARTUP | <main> | TransactionHandlingHistory: | Log file found. Parsing previous history | |
| node4 | 6m 2.735s | 2025-11-17 21:12:28.722 | 52 | INFO | STARTUP | <main> | StateInitializer: | The platform is using the following initial state: | |
| Round: 309 Timestamp: 2025-11-17T21:09:00.190305213Z Next consensus number: 10898 Legacy running event hash: 28282c7a1356e3441ed8a979521f15357c67be794c7bfa6593febb60908c1d7f8c594ab462953e6ab631678c1b8fe1ea Legacy running event mnemonic: agree-monster-hello-nurse Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -617936686 Root hash: f29f0997b970a387e63a252b6440e70e3ef2e3dae051a9d44bafd4f741150dded73917af4d57757a24ee164ac28757b1 (root) VirtualMap state / promote-cream-spirit-monkey | |||||||||
| node4 | 6m 2.741s | 2025-11-17 21:12:28.728 | 54 | INFO | RECONNECT | <<platform-core: reconnectController>> | ReconnectController: | Starting the ReconnectController | |
| node4 | 6m 2.941s | 2025-11-17 21:12:28.928 | 55 | INFO | EVENT_STREAM | <main> | DefaultConsensusEventStream: | EventStreamManager::updateRunningHash: 28282c7a1356e3441ed8a979521f15357c67be794c7bfa6593febb60908c1d7f8c594ab462953e6ab631678c1b8fe1ea | |
| node4 | 6m 2.951s | 2025-11-17 21:12:28.938 | 57 | INFO | STARTUP | <platformForkJoinThread-5> | Shadowgraph: | Shadowgraph starting from expiration threshold 282 | |
| node4 | 6m 2.957s | 2025-11-17 21:12:28.944 | 58 | INFO | STARTUP | <<start-node-4>> | ConsistencyTestingToolMain: | init called in Main for node 4. | |
| node4 | 6m 2.957s | 2025-11-17 21:12:28.944 | 59 | INFO | STARTUP | <<start-node-4>> | SwirldsPlatform: | Starting platform 4 | |
| node4 | 6m 2.958s | 2025-11-17 21:12:28.945 | 60 | INFO | STARTUP | <<platform: recycle-bin-cleanup>> | RecycleBinImpl: | Deleted 0 files from the recycle bin. | |
| node4 | 6m 2.962s | 2025-11-17 21:12:28.949 | 61 | INFO | STARTUP | <<start-node-4>> | CycleFinder: | No cyclical back pressure detected in wiring model. | |
| node4 | 6m 2.963s | 2025-11-17 21:12:28.950 | 62 | INFO | STARTUP | <<start-node-4>> | DirectSchedulerChecks: | No illegal direct scheduler use detected in the wiring model. | |
| node4 | 6m 2.963s | 2025-11-17 21:12:28.950 | 63 | INFO | STARTUP | <<start-node-4>> | InputWireChecks: | All input wires have been bound. | |
| node4 | 6m 2.965s | 2025-11-17 21:12:28.952 | 64 | INFO | STARTUP | <<start-node-4>> | SwirldsPlatform: | replaying preconsensus event stream starting at 282 | |
| node4 | 6m 2.972s | 2025-11-17 21:12:28.959 | 65 | INFO | PLATFORM_STATUS | <platformForkJoinThread-6> | StatusStateMachine: | Platform spent 170.0 ms in STARTING_UP. Now in REPLAYING_EVENTS | |
| node4 | 6m 3.229s | 2025-11-17 21:12:29.216 | 66 | INFO | STARTUP | <<scheduler ConsensusEngine>> | ConsensusImpl: | Found init judge (CR:2 H:3ea3a4285d4b BR:307), num remaining: 4 | |
| node4 | 6m 3.229s | 2025-11-17 21:12:29.216 | 67 | INFO | STARTUP | <<scheduler ConsensusEngine>> | ConsensusImpl: | Found init judge (CR:4 H:202031169173 BR:307), num remaining: 3 | |
| node4 | 6m 3.230s | 2025-11-17 21:12:29.217 | 68 | INFO | STARTUP | <<scheduler ConsensusEngine>> | ConsensusImpl: | Found init judge (CR:3 H:f34b0dc1301e BR:307), num remaining: 2 | |
| node4 | 6m 3.230s | 2025-11-17 21:12:29.217 | 69 | INFO | STARTUP | <<scheduler ConsensusEngine>> | ConsensusImpl: | Found init judge (CR:1 H:c0a98fdca037 BR:307), num remaining: 1 | |
| node4 | 6m 3.232s | 2025-11-17 21:12:29.219 | 70 | INFO | STARTUP | <<scheduler ConsensusEngine>> | ConsensusImpl: | Found init judge (CR:0 H:0bcabb9b32c8 BR:307), num remaining: 0 | |
| node4 | 6m 3.894s | 2025-11-17 21:12:29.881 | 738 | INFO | STARTUP | <<start-node-4>> | PcesReplayer: | Replayed 4,232 preconsensus events with max birth round 395. These events contained 5,828 transactions. 85 rounds reached consensus spanning 39.6 seconds of consensus time. The latest round to reach consensus is round 394. Replay took 927.0 milliseconds. | |
| node4 | 6m 3.899s | 2025-11-17 21:12:29.886 | 739 | INFO | STARTUP | <<app: appMain 4>> | ConsistencyTestingToolMain: | run called in Main. | |
| node4 | 6m 3.901s | 2025-11-17 21:12:29.888 | 740 | INFO | PLATFORM_STATUS | <platformForkJoinThread-7> | StatusStateMachine: | Platform spent 927.0 ms in REPLAYING_EVENTS. Now in OBSERVING | |
| node4 | 6m 4.776s | 2025-11-17 21:12:30.763 | 847 | INFO | RECONNECT | <<platform-core: reconnectController>> | ReconnectController: | Preparing for reconnect, stopping gossip | |
| node4 | 6m 4.777s | 2025-11-17 21:12:30.764 | 849 | INFO | RECONNECT | <<platform-core: SyncProtocolWith3 4 to 3>> | RpcPeerHandler: | SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=394,newEventBirthRound=395,ancientThreshold=366,expiredThreshold=293] remote ev=EventWindow[latestConsensusRound=786,newEventBirthRound=787,ancientThreshold=759,expiredThreshold=685] | |
| node4 | 6m 4.777s | 2025-11-17 21:12:30.764 | 850 | INFO | RECONNECT | <<platform-core: SyncProtocolWith0 4 to 0>> | RpcPeerHandler: | SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=394,newEventBirthRound=395,ancientThreshold=366,expiredThreshold=293] remote ev=EventWindow[latestConsensusRound=786,newEventBirthRound=787,ancientThreshold=759,expiredThreshold=685] | |
| node4 | 6m 4.777s | 2025-11-17 21:12:30.764 | 851 | INFO | RECONNECT | <<platform-core: SyncProtocolWith2 4 to 2>> | RpcPeerHandler: | SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=394,newEventBirthRound=395,ancientThreshold=366,expiredThreshold=293] remote ev=EventWindow[latestConsensusRound=786,newEventBirthRound=787,ancientThreshold=759,expiredThreshold=685] | |
| node4 | 6m 4.777s | 2025-11-17 21:12:30.764 | 848 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | RpcPeerHandler: | SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=394,newEventBirthRound=395,ancientThreshold=366,expiredThreshold=293] remote ev=EventWindow[latestConsensusRound=786,newEventBirthRound=787,ancientThreshold=759,expiredThreshold=685] | |
| node4 | 6m 4.777s | 2025-11-17 21:12:30.764 | 852 | INFO | PLATFORM_STATUS | <platformForkJoinThread-8> | StatusStateMachine: | Platform spent 874.0 ms in OBSERVING. Now in BEHIND | |
| node4 | 6m 4.777s | 2025-11-17 21:12:30.764 | 853 | INFO | RECONNECT | <<platform-core: reconnectController>> | ReconnectController: | Preparing for reconnect, start clearing queues | |
| node0 | 6m 4.847s | 2025-11-17 21:12:30.834 | 8990 | INFO | RECONNECT | <<platform-core: SyncProtocolWith4 0 to 4>> | RpcPeerHandler: | OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=786,newEventBirthRound=787,ancientThreshold=759,expiredThreshold=685] remote ev=EventWindow[latestConsensusRound=394,newEventBirthRound=395,ancientThreshold=366,expiredThreshold=293] | |
| node1 | 6m 4.847s | 2025-11-17 21:12:30.834 | 9036 | INFO | RECONNECT | <<platform-core: SyncProtocolWith4 1 to 4>> | RpcPeerHandler: | OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=786,newEventBirthRound=787,ancientThreshold=759,expiredThreshold=685] remote ev=EventWindow[latestConsensusRound=394,newEventBirthRound=395,ancientThreshold=366,expiredThreshold=293] | |
| node2 | 6m 4.847s | 2025-11-17 21:12:30.834 | 9212 | INFO | RECONNECT | <<platform-core: SyncProtocolWith4 2 to 4>> | RpcPeerHandler: | OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=786,newEventBirthRound=787,ancientThreshold=759,expiredThreshold=685] remote ev=EventWindow[latestConsensusRound=394,newEventBirthRound=395,ancientThreshold=366,expiredThreshold=293] | |
| node3 | 6m 4.847s | 2025-11-17 21:12:30.834 | 8976 | INFO | RECONNECT | <<platform-core: SyncProtocolWith4 3 to 4>> | RpcPeerHandler: | OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=786,newEventBirthRound=787,ancientThreshold=759,expiredThreshold=685] remote ev=EventWindow[latestConsensusRound=394,newEventBirthRound=395,ancientThreshold=366,expiredThreshold=293] | |
| node4 | 6m 4.929s | 2025-11-17 21:12:30.916 | 854 | INFO | RECONNECT | <<platform-core: reconnectController>> | ReconnectController: | Queues have been cleared | |
| node4 | 6m 4.930s | 2025-11-17 21:12:30.917 | 855 | INFO | RECONNECT | <<platform-core: reconnectController>> | ReconnectController: | Waiting for a state to be obtained from a peer | |
| node1 | 6m 5.024s | 2025-11-17 21:12:31.011 | 9040 | INFO | RECONNECT | <<platform-core: SyncProtocolWith4 1 to 4>> | ReconnectStateTeacher: | Starting reconnect in the role of the sender {"receiving":false,"nodeId":1,"otherNodeId":4,"round":786} [com.swirlds.logging.legacy.payload.ReconnectStartPayload] | |
| node1 | 6m 5.025s | 2025-11-17 21:12:31.012 | 9041 | INFO | RECONNECT | <<platform-core: SyncProtocolWith4 1 to 4>> | ReconnectStateTeacher: | The following state will be sent to the learner: | |
| Round: 786 Timestamp: 2025-11-17T21:12:29.550175Z Next consensus number: 23468 Legacy running event hash: 67f762841e3c9fc5158e20aec6b73dd2170a91edfd87c11bbc22d835df5d308ff1c61c40888e718fb1fba4acbbc219aa Legacy running event mnemonic: year-explain-shy-alter Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -559176979 Root hash: ae732455aef609917615054df9d453c033f8ec7cf081616aee68480dceaf94220d5f5ccdc3e4f925dd7502de51461960 (root) VirtualMap state / coyote-melt-candy-cram | |||||||||
| node1 | 6m 5.025s | 2025-11-17 21:12:31.012 | 9042 | INFO | RECONNECT | <<platform-core: SyncProtocolWith4 1 to 4>> | ReconnectStateTeacher: | Sending signatures from nodes 1, 2, 3 (signing weight = 37500000000/50000000000) for state hash ae732455aef609917615054df9d453c033f8ec7cf081616aee68480dceaf94220d5f5ccdc3e4f925dd7502de51461960 | |
| node1 | 6m 5.026s | 2025-11-17 21:12:31.013 | 9043 | INFO | RECONNECT | <<platform-core: SyncProtocolWith4 1 to 4>> | ReconnectStateTeacher: | Starting synchronization in the role of the sender. | |
| node4 | 6m 5.094s | 2025-11-17 21:12:31.081 | 856 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | ReconnectStatePeerProtocol: | Starting reconnect in role of the receiver. {"receiving":true,"nodeId":4,"otherNodeId":1,"round":394} [com.swirlds.logging.legacy.payload.ReconnectStartPayload] | |
| node4 | 6m 5.095s | 2025-11-17 21:12:31.082 | 857 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | ReconnectStateLearner: | Receiving signed state signatures | |
| node4 | 6m 5.096s | 2025-11-17 21:12:31.083 | 858 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | ReconnectStateLearner: | Received signatures from nodes 1, 2, 3 | |
| node1 | 6m 5.148s | 2025-11-17 21:12:31.135 | 9059 | INFO | RECONNECT | <<platform-core: SyncProtocolWith4 1 to 4>> | TeachingSynchronizer: | sending tree rooted at com.swirlds.virtualmap.VirtualMap with route [] | |
| node1 | 6m 5.156s | 2025-11-17 21:12:31.143 | 9060 | INFO | RECONNECT | <<work group teaching-synchronizer: async-input-stream #0>> | AsyncInputStream: | com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@3fe7750b start run() | |
| node4 | 6m 5.308s | 2025-11-17 21:12:31.295 | 885 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | LearningSynchronizer: | learner calls receiveTree() | |
| node4 | 6m 5.308s | 2025-11-17 21:12:31.295 | 886 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | LearningSynchronizer: | synchronizing tree | |
| node4 | 6m 5.309s | 2025-11-17 21:12:31.296 | 887 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | LearningSynchronizer: | receiving tree rooted at com.swirlds.virtualmap.VirtualMap with route [] | |
| node4 | 6m 5.316s | 2025-11-17 21:12:31.303 | 888 | INFO | RECONNECT | <<work group learning-synchronizer: async-input-stream #0>> | AsyncInputStream: | com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@7b5eb9bb start run() | |
| node4 | 6m 5.375s | 2025-11-17 21:12:31.362 | 889 | INFO | RECONNECT | <<work group learning-synchronizer: async-input-stream #0>> | ReconnectNodeRemover: | setPathInformation(): firstLeafPath: 4 -> 4, lastLeafPath: 8 -> 8 | |
| node4 | 6m 5.375s | 2025-11-17 21:12:31.362 | 890 | INFO | RECONNECT | <<work group learning-synchronizer: async-input-stream #0>> | ReconnectNodeRemover: | setPathInformation(): done | |
| node4 | 6m 5.528s | 2025-11-17 21:12:31.515 | 891 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | LearnerPushTask: | learner thread finished the learning loop for the current subtree | |
| node4 | 6m 5.529s | 2025-11-17 21:12:31.516 | 892 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | LearnerPushVirtualTreeView: | call nodeRemover.allNodesReceived() | |
| node4 | 6m 5.529s | 2025-11-17 21:12:31.516 | 893 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | ReconnectNodeRemover: | allNodesReceived() | |
| node4 | 6m 5.530s | 2025-11-17 21:12:31.517 | 894 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | ReconnectNodeRemover: | allNodesReceived(): done | |
| node4 | 6m 5.530s | 2025-11-17 21:12:31.517 | 895 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | LearnerPushVirtualTreeView: | call root.endLearnerReconnect() | |
| node4 | 6m 5.530s | 2025-11-17 21:12:31.517 | 896 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | VirtualMap: | call reconnectIterator.close() | |
| node4 | 6m 5.530s | 2025-11-17 21:12:31.517 | 897 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | VirtualMap: | call setHashPrivate() | |
| node4 | 6m 5.552s | 2025-11-17 21:12:31.539 | 907 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | VirtualMap: | call postInit() | |
| node4 | 6m 5.552s | 2025-11-17 21:12:31.539 | 909 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | VirtualMap: | endLearnerReconnect() complete | |
| node4 | 6m 5.552s | 2025-11-17 21:12:31.539 | 910 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | LearnerPushVirtualTreeView: | close() complete | |
| node4 | 6m 5.553s | 2025-11-17 21:12:31.540 | 911 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | LearnerPushTask: | learner thread closed input, output, and view for the current subtree | |
| node4 | 6m 5.553s | 2025-11-17 21:12:31.540 | 912 | INFO | RECONNECT | <<work group learning-synchronizer: async-input-stream #0>> | AsyncInputStream: | com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@7b5eb9bb finish run() | |
| node4 | 6m 5.554s | 2025-11-17 21:12:31.541 | 913 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | LearningSynchronizer: | received tree rooted at com.swirlds.virtualmap.VirtualMap with route [] | |
| node4 | 6m 5.554s | 2025-11-17 21:12:31.541 | 914 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | LearningSynchronizer: | synchronization complete | |
| node4 | 6m 5.554s | 2025-11-17 21:12:31.541 | 915 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | LearningSynchronizer: | learner calls initialize() | |
| node4 | 6m 5.555s | 2025-11-17 21:12:31.542 | 916 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | LearningSynchronizer: | initializing tree | |
| node4 | 6m 5.555s | 2025-11-17 21:12:31.542 | 917 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | LearningSynchronizer: | initialization complete | |
| node4 | 6m 5.555s | 2025-11-17 21:12:31.542 | 918 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | LearningSynchronizer: | learner calls hash() | |
| node4 | 6m 5.555s | 2025-11-17 21:12:31.542 | 919 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | LearningSynchronizer: | hashing tree | |
| node4 | 6m 5.555s | 2025-11-17 21:12:31.542 | 920 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | LearningSynchronizer: | hashing complete | |
| node4 | 6m 5.555s | 2025-11-17 21:12:31.542 | 921 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | LearningSynchronizer: | learner calls logStatistics() | |
| node4 | 6m 5.558s | 2025-11-17 21:12:31.545 | 922 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | LearningSynchronizer: | Finished synchronization {"timeInSeconds":0.245,"hashTimeInSeconds":0.0,"initializationTimeInSeconds":0.0,"totalNodes":9,"leafNodes":5,"redundantLeafNodes":2,"internalNodes":4,"redundantInternalNodes":0} [com.swirlds.logging.legacy.payload.SynchronizationCompletePayload] | |
| node4 | 6m 5.558s | 2025-11-17 21:12:31.545 | 923 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | LearningSynchronizer: | ReconnectMapMetrics: transfersFromTeacher=9; transfersFromLearner=8; internalHashes=3; internalCleanHashes=0; internalData=0; internalCleanData=0; leafHashes=5; leafCleanHashes=2; leafData=5; leafCleanData=2 | |
| node4 | 6m 5.559s | 2025-11-17 21:12:31.546 | 924 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | LearningSynchronizer: | learner is done synchronizing | |
| node4 | 6m 5.560s | 2025-11-17 21:12:31.547 | 925 | INFO | STARTUP | <<platform-core: SyncProtocolWith1 4 to 1>> | ConsistencyTestingToolState: | New State Constructed. | |
| node4 | 6m 5.564s | 2025-11-17 21:12:31.551 | 926 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | ReconnectStateLearner: | Reconnect data usage report {"dataMegabytes":0.005864143371582031} [com.swirlds.logging.legacy.payload.ReconnectDataUsagePayload] | |
| node1 | 6m 5.591s | 2025-11-17 21:12:31.578 | 9064 | INFO | RECONNECT | <<work group teaching-synchronizer: async-input-stream #0>> | AsyncInputStream: | com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@3fe7750b finish run() | |
| node1 | 6m 5.595s | 2025-11-17 21:12:31.582 | 9065 | INFO | RECONNECT | <<platform-core: SyncProtocolWith4 1 to 4>> | TeachingSynchronizer: | finished sending tree | |
| node1 | 6m 5.598s | 2025-11-17 21:12:31.585 | 9078 | INFO | RECONNECT | <<platform-core: SyncProtocolWith4 1 to 4>> | ReconnectStateTeacher: | Finished synchronization in the role of the sender. | |
| node1 | 6m 5.637s | 2025-11-17 21:12:31.624 | 9079 | INFO | RECONNECT | <<platform-core: SyncProtocolWith4 1 to 4>> | ReconnectStateTeacher: | Finished reconnect in the role of the sender. {"receiving":false,"nodeId":1,"otherNodeId":4,"round":786} [com.swirlds.logging.legacy.payload.ReconnectFinishPayload] | |
| node4 | 6m 5.674s | 2025-11-17 21:12:31.661 | 927 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | ReconnectStatePeerProtocol: | Finished reconnect in the role of the receiver. {"receiving":true,"nodeId":4,"otherNodeId":1,"round":786} [com.swirlds.logging.legacy.payload.ReconnectFinishPayload] | |
| node4 | 6m 5.676s | 2025-11-17 21:12:31.663 | 928 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | ReconnectStatePeerProtocol: | Information for state received during reconnect: | |
| Round: 786 Timestamp: 2025-11-17T21:12:29.550175Z Next consensus number: 23468 Legacy running event hash: 67f762841e3c9fc5158e20aec6b73dd2170a91edfd87c11bbc22d835df5d308ff1c61c40888e718fb1fba4acbbc219aa Legacy running event mnemonic: year-explain-shy-alter Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -559176979 Root hash: ae732455aef609917615054df9d453c033f8ec7cf081616aee68480dceaf94220d5f5ccdc3e4f925dd7502de51461960 (root) VirtualMap state / coyote-melt-candy-cram | |||||||||
| node4 | 6m 5.677s | 2025-11-17 21:12:31.664 | 929 | INFO | RECONNECT | <<platform-core: reconnectController>> | ReconnectController: | A state was obtained from a peer | |
| node4 | 6m 5.679s | 2025-11-17 21:12:31.666 | 930 | INFO | RECONNECT | <<platform-core: reconnectController>> | ReconnectController: | The state obtained from a peer was validated | |
| node4 | 6m 5.679s | 2025-11-17 21:12:31.666 | 932 | DEBUG | RECONNECT | <<platform-core: reconnectController>> | ReconnectController: | `loadState` : reloading state | |
| node4 | 6m 5.680s | 2025-11-17 21:12:31.667 | 933 | INFO | STARTUP | <<platform-core: reconnectController>> | ConsistencyTestingToolState: | State initialized with state long 2986304853824918424. | |
| node4 | 6m 5.681s | 2025-11-17 21:12:31.668 | 934 | INFO | STARTUP | <<platform-core: reconnectController>> | ConsistencyTestingToolState: | State initialized with 786 rounds handled. | |
| node4 | 6m 5.681s | 2025-11-17 21:12:31.668 | 935 | INFO | STARTUP | <<platform-core: reconnectController>> | TransactionHandlingHistory: | Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv | |
| node4 | 6m 5.681s | 2025-11-17 21:12:31.668 | 936 | INFO | STARTUP | <<platform-core: reconnectController>> | TransactionHandlingHistory: | Log file found. Parsing previous history | |
| node4 | 6m 5.699s | 2025-11-17 21:12:31.686 | 941 | INFO | STATE_TO_DISK | <<platform-core: reconnectController>> | DefaultSavedStateController: | Signed state from round 786 created, will eventually be written to disk, for reason: RECONNECT | |
| node4 | 6m 5.699s | 2025-11-17 21:12:31.686 | 942 | INFO | PLATFORM_STATUS | <platformForkJoinThread-7> | StatusStateMachine: | Platform spent 921.0 ms in BEHIND. Now in RECONNECT_COMPLETE | |
| node4 | 6m 5.700s | 2025-11-17 21:12:31.687 | 944 | INFO | STARTUP | <platformForkJoinThread-8> | Shadowgraph: | Shadowgraph starting from expiration threshold 759 | |
| node4 | 6m 5.702s | 2025-11-17 21:12:31.689 | 946 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 786 state to disk. Reason: RECONNECT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/786 | |
| node4 | 6m 5.703s | 2025-11-17 21:12:31.690 | 947 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/3 for com.swirlds.demo.consistency.ConsistencyTestingToolState@70cade94 | |
| node4 | 6m 5.713s | 2025-11-17 21:12:31.700 | 955 | INFO | EVENT_STREAM | <<platform-core: reconnectController>> | DefaultConsensusEventStream: | EventStreamManager::updateRunningHash: 67f762841e3c9fc5158e20aec6b73dd2170a91edfd87c11bbc22d835df5d308ff1c61c40888e718fb1fba4acbbc219aa | |
| node4 | 6m 5.714s | 2025-11-17 21:12:31.701 | 956 | INFO | STARTUP | <platformForkJoinThread-5> | PcesFileManager: | Due to recent operations on this node, the local preconsensus event stream will have a discontinuity. The last file with the old origin round is 2025-11-17T21+06+41.140869703Z_seq0_minr1_maxr395_orgn0.pces. All future files will have an origin round of 786. | |
| node4 | 6m 5.714s | 2025-11-17 21:12:31.701 | 957 | INFO | RECONNECT | <<platform-core: reconnectController>> | ReconnectController: | Reconnect almost done resuming gossip | |
| node4 | 6m 5.873s | 2025-11-17 21:12:31.860 | 985 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/3 for com.swirlds.demo.consistency.ConsistencyTestingToolState@70cade94 | |
| node4 | 6m 5.876s | 2025-11-17 21:12:31.863 | 986 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 786 Timestamp: 2025-11-17T21:12:29.550175Z Next consensus number: 23468 Legacy running event hash: 67f762841e3c9fc5158e20aec6b73dd2170a91edfd87c11bbc22d835df5d308ff1c61c40888e718fb1fba4acbbc219aa Legacy running event mnemonic: year-explain-shy-alter Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -559176979 Root hash: ae732455aef609917615054df9d453c033f8ec7cf081616aee68480dceaf94220d5f5ccdc3e4f925dd7502de51461960 (root) VirtualMap state / coyote-melt-candy-cram | |||||||||
| node4 | 6m 5.911s | 2025-11-17 21:12:31.898 | 987 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/4/2025/11/17/2025-11-17T21+06+41.140869703Z_seq0_minr1_maxr395_orgn0.pces | |||||||||
| node4 | 6m 5.911s | 2025-11-17 21:12:31.898 | 988 | WARN | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | No preconsensus event files meeting specified criteria found to copy. Lower bound: 759 | |
| node4 | 6m 5.921s | 2025-11-17 21:12:31.908 | 989 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 786 to disk. Reason: RECONNECT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/786 {"round":786,"freezeState":false,"reason":"RECONNECT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/786/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node4 | 6m 5.926s | 2025-11-17 21:12:31.913 | 990 | INFO | PLATFORM_STATUS | <platformForkJoinThread-7> | StatusStateMachine: | Platform spent 225.0 ms in RECONNECT_COMPLETE. Now in CHECKING | |
| node4 | 6m 5.971s | 2025-11-17 21:12:31.958 | 991 | INFO | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting4.csv' ] | |
| node4 | 6m 5.974s | 2025-11-17 21:12:31.961 | 992 | DEBUG | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ] | |
| node4 | 6m 6.862s | 2025-11-17 21:12:32.849 | 993 | INFO | STARTUP | <<scheduler ConsensusEngine>> | ConsensusImpl: | Found init judge (CR:2 H:e7230356831b BR:784), num remaining: 3 | |
| node4 | 6m 6.863s | 2025-11-17 21:12:32.850 | 994 | INFO | STARTUP | <<scheduler ConsensusEngine>> | ConsensusImpl: | Found init judge (CR:0 H:928ffa99acc8 BR:784), num remaining: 2 | |
| node4 | 6m 6.864s | 2025-11-17 21:12:32.851 | 995 | INFO | STARTUP | <<scheduler ConsensusEngine>> | ConsensusImpl: | Found init judge (CR:1 H:fc1ed387b3e2 BR:784), num remaining: 1 | |
| node4 | 6m 6.864s | 2025-11-17 21:12:32.851 | 996 | INFO | STARTUP | <<scheduler ConsensusEngine>> | ConsensusImpl: | Found init judge (CR:3 H:08b04cbdb5f2 BR:785), num remaining: 0 | |
| node4 | 6m 10.770s | 2025-11-17 21:12:36.757 | 1133 | INFO | PLATFORM_STATUS | <platformForkJoinThread-4> | StatusStateMachine: | Platform spent 4.8 s in CHECKING. Now in ACTIVE | |
| node3 | 6m 35.410s | 2025-11-17 21:13:01.397 | 9720 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 855 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node2 | 6m 35.507s | 2025-11-17 21:13:01.494 | 9947 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 855 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node1 | 6m 35.542s | 2025-11-17 21:13:01.529 | 9801 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 855 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node0 | 6m 35.571s | 2025-11-17 21:13:01.558 | 9716 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 855 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node0 | 6m 35.639s | 2025-11-17 21:13:01.626 | 9719 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 855 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/855 | |
| node0 | 6m 35.640s | 2025-11-17 21:13:01.627 | 9720 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for com.swirlds.demo.consistency.ConsistencyTestingToolState@7a7ace69 | |
| node1 | 6m 35.646s | 2025-11-17 21:13:01.633 | 9814 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 855 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/855 | |
| node1 | 6m 35.646s | 2025-11-17 21:13:01.633 | 9815 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/52 for com.swirlds.demo.consistency.ConsistencyTestingToolState@2bc866c6 | |
| node4 | 6m 35.649s | 2025-11-17 21:13:01.636 | 1712 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 855 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node4 | 6m 35.657s | 2025-11-17 21:13:01.644 | 1715 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 855 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/855 | |
| node4 | 6m 35.658s | 2025-11-17 21:13:01.645 | 1716 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/10 for com.swirlds.demo.consistency.ConsistencyTestingToolState@16cf7065 | |
| node0 | 6m 35.719s | 2025-11-17 21:13:01.706 | 9753 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for com.swirlds.demo.consistency.ConsistencyTestingToolState@7a7ace69 | |
| node0 | 6m 35.721s | 2025-11-17 21:13:01.708 | 9754 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 855 Timestamp: 2025-11-17T21:13:00.408106Z Next consensus number: 25801 Legacy running event hash: e8416ea822e5f5dd1b12b1f8e40ce9688fc385326133b55ca0b6650d0fd43c37696e0ad970c91336277349cbc28b034b Legacy running event mnemonic: wave-steak-frequent-dutch Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1547219963 Root hash: aecb90af3d4b7389061201ffa779cfab43c1ecf2db0f1eb48588c58e3ca56ba5e380611a555d36fb0b345799ecae53a9 (root) VirtualMap state / chase-tuition-practice-illness | |||||||||
| node0 | 6m 35.727s | 2025-11-17 21:13:01.714 | 9755 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+10+27.186566834Z_seq1_minr474_maxr5474_orgn0.pces Last file: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+06+40.946965742Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 6m 35.727s | 2025-11-17 21:13:01.714 | 9756 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 828 File: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+10+27.186566834Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node1 | 6m 35.727s | 2025-11-17 21:13:01.714 | 9846 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/52 for com.swirlds.demo.consistency.ConsistencyTestingToolState@2bc866c6 | |
| node1 | 6m 35.728s | 2025-11-17 21:13:01.715 | 9847 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 855 Timestamp: 2025-11-17T21:13:00.408106Z Next consensus number: 25801 Legacy running event hash: e8416ea822e5f5dd1b12b1f8e40ce9688fc385326133b55ca0b6650d0fd43c37696e0ad970c91336277349cbc28b034b Legacy running event mnemonic: wave-steak-frequent-dutch Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1547219963 Root hash: aecb90af3d4b7389061201ffa779cfab43c1ecf2db0f1eb48588c58e3ca56ba5e380611a555d36fb0b345799ecae53a9 (root) VirtualMap state / chase-tuition-practice-illness | |||||||||
| node0 | 6m 35.730s | 2025-11-17 21:13:01.717 | 9757 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node1 | 6m 35.734s | 2025-11-17 21:13:01.721 | 9848 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+06+41.121642458Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+10+27.157584500Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node1 | 6m 35.735s | 2025-11-17 21:13:01.722 | 9849 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 828 File: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+10+27.157584500Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node1 | 6m 35.735s | 2025-11-17 21:13:01.722 | 9850 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node0 | 6m 35.736s | 2025-11-17 21:13:01.723 | 9758 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node0 | 6m 35.737s | 2025-11-17 21:13:01.724 | 9759 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 855 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/855 {"round":855,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/855/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node0 | 6m 35.738s | 2025-11-17 21:13:01.725 | 9760 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/174 | |
| node1 | 6m 35.742s | 2025-11-17 21:13:01.729 | 9851 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node1 | 6m 35.742s | 2025-11-17 21:13:01.729 | 9852 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 855 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/855 {"round":855,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/855/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node1 | 6m 35.744s | 2025-11-17 21:13:01.731 | 9853 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/174 | |
| node3 | 6m 35.758s | 2025-11-17 21:13:01.745 | 9735 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 855 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/855 | |
| node3 | 6m 35.759s | 2025-11-17 21:13:01.746 | 9736 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for com.swirlds.demo.consistency.ConsistencyTestingToolState@42de99da | |
| node4 | 6m 35.786s | 2025-11-17 21:13:01.773 | 1769 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/10 for com.swirlds.demo.consistency.ConsistencyTestingToolState@16cf7065 | |
| node4 | 6m 35.788s | 2025-11-17 21:13:01.775 | 1770 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 855 Timestamp: 2025-11-17T21:13:00.408106Z Next consensus number: 25801 Legacy running event hash: e8416ea822e5f5dd1b12b1f8e40ce9688fc385326133b55ca0b6650d0fd43c37696e0ad970c91336277349cbc28b034b Legacy running event mnemonic: wave-steak-frequent-dutch Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1547219963 Root hash: aecb90af3d4b7389061201ffa779cfab43c1ecf2db0f1eb48588c58e3ca56ba5e380611a555d36fb0b345799ecae53a9 (root) VirtualMap state / chase-tuition-practice-illness | |||||||||
| node4 | 6m 35.798s | 2025-11-17 21:13:01.785 | 1771 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/4/2025/11/17/2025-11-17T21+06+41.140869703Z_seq0_minr1_maxr395_orgn0.pces Last file: data/saved/preconsensus-events/4/2025/11/17/2025-11-17T21+12+32.278207882Z_seq1_minr759_maxr1259_orgn786.pces | |||||||||
| node4 | 6m 35.798s | 2025-11-17 21:13:01.785 | 1772 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 828 File: data/saved/preconsensus-events/4/2025/11/17/2025-11-17T21+12+32.278207882Z_seq1_minr759_maxr1259_orgn786.pces | |||||||||
| node4 | 6m 35.798s | 2025-11-17 21:13:01.785 | 1773 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node4 | 6m 35.803s | 2025-11-17 21:13:01.790 | 1774 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node4 | 6m 35.803s | 2025-11-17 21:13:01.790 | 1775 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 855 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/855 {"round":855,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/855/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node4 | 6m 35.805s | 2025-11-17 21:13:01.792 | 1776 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1 | |
| node3 | 6m 35.835s | 2025-11-17 21:13:01.822 | 9770 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for com.swirlds.demo.consistency.ConsistencyTestingToolState@42de99da | |
| node3 | 6m 35.837s | 2025-11-17 21:13:01.824 | 9771 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 855 Timestamp: 2025-11-17T21:13:00.408106Z Next consensus number: 25801 Legacy running event hash: e8416ea822e5f5dd1b12b1f8e40ce9688fc385326133b55ca0b6650d0fd43c37696e0ad970c91336277349cbc28b034b Legacy running event mnemonic: wave-steak-frequent-dutch Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1547219963 Root hash: aecb90af3d4b7389061201ffa779cfab43c1ecf2db0f1eb48588c58e3ca56ba5e380611a555d36fb0b345799ecae53a9 (root) VirtualMap state / chase-tuition-practice-illness | |||||||||
| node3 | 6m 35.843s | 2025-11-17 21:13:01.830 | 9772 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+10+27.285649301Z_seq1_minr474_maxr5474_orgn0.pces Last file: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+06+41.067149334Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 6m 35.844s | 2025-11-17 21:13:01.831 | 9773 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 828 File: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+10+27.285649301Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node3 | 6m 35.846s | 2025-11-17 21:13:01.833 | 9774 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node2 | 6m 35.850s | 2025-11-17 21:13:01.837 | 9960 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 855 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/855 | |
| node2 | 6m 35.851s | 2025-11-17 21:13:01.838 | 9961 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for com.swirlds.demo.consistency.ConsistencyTestingToolState@3b7455b0 | |
| node3 | 6m 35.853s | 2025-11-17 21:13:01.840 | 9775 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node3 | 6m 35.853s | 2025-11-17 21:13:01.840 | 9776 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 855 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/855 {"round":855,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/855/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node3 | 6m 35.854s | 2025-11-17 21:13:01.841 | 9777 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/174 | |
| node2 | 6m 35.934s | 2025-11-17 21:13:01.921 | 9995 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for com.swirlds.demo.consistency.ConsistencyTestingToolState@3b7455b0 | |
| node2 | 6m 35.936s | 2025-11-17 21:13:01.923 | 9996 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 855 Timestamp: 2025-11-17T21:13:00.408106Z Next consensus number: 25801 Legacy running event hash: e8416ea822e5f5dd1b12b1f8e40ce9688fc385326133b55ca0b6650d0fd43c37696e0ad970c91336277349cbc28b034b Legacy running event mnemonic: wave-steak-frequent-dutch Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1547219963 Root hash: aecb90af3d4b7389061201ffa779cfab43c1ecf2db0f1eb48588c58e3ca56ba5e380611a555d36fb0b345799ecae53a9 (root) VirtualMap state / chase-tuition-practice-illness | |||||||||
| node2 | 6m 35.944s | 2025-11-17 21:13:01.931 | 9997 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+10+27.225461213Z_seq1_minr474_maxr5474_orgn0.pces Last file: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+06+41.228512734Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 6m 35.944s | 2025-11-17 21:13:01.931 | 9998 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 828 File: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+10+27.225461213Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node2 | 6m 35.947s | 2025-11-17 21:13:01.934 | 9999 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node2 | 6m 35.953s | 2025-11-17 21:13:01.940 | 10000 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node2 | 6m 35.954s | 2025-11-17 21:13:01.941 | 10001 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 855 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/855 {"round":855,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/855/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node2 | 6m 35.955s | 2025-11-17 21:13:01.942 | 10002 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/174 | |
| node2 | 7m 34.919s | 2025-11-17 21:14:00.906 | 11440 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 987 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node1 | 7m 34.921s | 2025-11-17 21:14:00.908 | 11300 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 987 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node3 | 7m 34.979s | 2025-11-17 21:14:00.966 | 11229 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 987 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node0 | 7m 35.010s | 2025-11-17 21:14:00.997 | 11237 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 987 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node4 | 7m 35.017s | 2025-11-17 21:14:01.004 | 3209 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 987 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node0 | 7m 35.099s | 2025-11-17 21:14:01.086 | 11240 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 987 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/987 | |
| node0 | 7m 35.100s | 2025-11-17 21:14:01.087 | 11241 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/57 for com.swirlds.demo.consistency.ConsistencyTestingToolState@6c96d67 | |
| node3 | 7m 35.159s | 2025-11-17 21:14:01.146 | 11232 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 987 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/987 | |
| node3 | 7m 35.159s | 2025-11-17 21:14:01.146 | 11233 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/57 for com.swirlds.demo.consistency.ConsistencyTestingToolState@52f1026b | |
| node2 | 7m 35.170s | 2025-11-17 21:14:01.157 | 11443 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 987 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/987 | |
| node2 | 7m 35.170s | 2025-11-17 21:14:01.157 | 11444 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/57 for com.swirlds.demo.consistency.ConsistencyTestingToolState@576dc598 | |
| node0 | 7m 35.180s | 2025-11-17 21:14:01.167 | 11272 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/57 for com.swirlds.demo.consistency.ConsistencyTestingToolState@6c96d67 | |
| node0 | 7m 35.182s | 2025-11-17 21:14:01.169 | 11273 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 987 Timestamp: 2025-11-17T21:14:00.002272Z Next consensus number: 30564 Legacy running event hash: a8c25818cefbac67743e9a96139cbd5e063dfafce1c65afb2152f0d29291caff61f5db5e0fa5eb510184e445aece88b8 Legacy running event mnemonic: treat-adapt-camera-industry Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -1804571859 Root hash: 28ca7ea41e1d977824763a3658cfaa6f1604355f7eb29c70122f647880f2a24b1e70a92eb0ac267cd6f656a5f3fd5917 (root) VirtualMap state / bone-stand-senior-baby | |||||||||
| node0 | 7m 35.189s | 2025-11-17 21:14:01.176 | 11274 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+10+27.186566834Z_seq1_minr474_maxr5474_orgn0.pces Last file: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+06+40.946965742Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 7m 35.189s | 2025-11-17 21:14:01.176 | 11275 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 960 File: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+10+27.186566834Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node0 | 7m 35.189s | 2025-11-17 21:14:01.176 | 11276 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node0 | 7m 35.199s | 2025-11-17 21:14:01.186 | 11277 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node0 | 7m 35.200s | 2025-11-17 21:14:01.187 | 11278 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 987 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/987 {"round":987,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/987/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node0 | 7m 35.201s | 2025-11-17 21:14:01.188 | 11279 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/309 | |
| node3 | 7m 35.239s | 2025-11-17 21:14:01.226 | 11264 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/57 for com.swirlds.demo.consistency.ConsistencyTestingToolState@52f1026b | |
| node4 | 7m 35.240s | 2025-11-17 21:14:01.227 | 3212 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 987 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/987 | |
| node4 | 7m 35.240s | 2025-11-17 21:14:01.227 | 3213 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/17 for com.swirlds.demo.consistency.ConsistencyTestingToolState@1a4aeb8e | |
| node3 | 7m 35.241s | 2025-11-17 21:14:01.228 | 11265 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 987 Timestamp: 2025-11-17T21:14:00.002272Z Next consensus number: 30564 Legacy running event hash: a8c25818cefbac67743e9a96139cbd5e063dfafce1c65afb2152f0d29291caff61f5db5e0fa5eb510184e445aece88b8 Legacy running event mnemonic: treat-adapt-camera-industry Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -1804571859 Root hash: 28ca7ea41e1d977824763a3658cfaa6f1604355f7eb29c70122f647880f2a24b1e70a92eb0ac267cd6f656a5f3fd5917 (root) VirtualMap state / bone-stand-senior-baby | |||||||||
| node3 | 7m 35.248s | 2025-11-17 21:14:01.235 | 11266 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+10+27.285649301Z_seq1_minr474_maxr5474_orgn0.pces Last file: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+06+41.067149334Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 7m 35.248s | 2025-11-17 21:14:01.235 | 11267 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 960 File: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+10+27.285649301Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node3 | 7m 35.248s | 2025-11-17 21:14:01.235 | 11268 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node2 | 7m 35.251s | 2025-11-17 21:14:01.238 | 11483 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/57 for com.swirlds.demo.consistency.ConsistencyTestingToolState@576dc598 | |
| node2 | 7m 35.253s | 2025-11-17 21:14:01.240 | 11484 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 987 Timestamp: 2025-11-17T21:14:00.002272Z Next consensus number: 30564 Legacy running event hash: a8c25818cefbac67743e9a96139cbd5e063dfafce1c65afb2152f0d29291caff61f5db5e0fa5eb510184e445aece88b8 Legacy running event mnemonic: treat-adapt-camera-industry Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -1804571859 Root hash: 28ca7ea41e1d977824763a3658cfaa6f1604355f7eb29c70122f647880f2a24b1e70a92eb0ac267cd6f656a5f3fd5917 (root) VirtualMap state / bone-stand-senior-baby | |||||||||
| node2 | 7m 35.258s | 2025-11-17 21:14:01.245 | 11485 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+10+27.225461213Z_seq1_minr474_maxr5474_orgn0.pces Last file: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+06+41.228512734Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 7m 35.258s | 2025-11-17 21:14:01.245 | 11486 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 960 File: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+10+27.225461213Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node2 | 7m 35.258s | 2025-11-17 21:14:01.245 | 11487 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node3 | 7m 35.258s | 2025-11-17 21:14:01.245 | 11269 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node3 | 7m 35.258s | 2025-11-17 21:14:01.245 | 11270 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 987 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/987 {"round":987,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/987/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node3 | 7m 35.260s | 2025-11-17 21:14:01.247 | 11271 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/309 | |
| node1 | 7m 35.263s | 2025-11-17 21:14:01.250 | 11313 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 987 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/987 | |
| node1 | 7m 35.263s | 2025-11-17 21:14:01.250 | 11314 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/59 for com.swirlds.demo.consistency.ConsistencyTestingToolState@2fef96ad | |
| node2 | 7m 35.268s | 2025-11-17 21:14:01.255 | 11488 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node2 | 7m 35.269s | 2025-11-17 21:14:01.256 | 11489 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 987 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/987 {"round":987,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/987/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node2 | 7m 35.270s | 2025-11-17 21:14:01.257 | 11490 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/309 | |
| node1 | 7m 35.343s | 2025-11-17 21:14:01.330 | 11348 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/59 for com.swirlds.demo.consistency.ConsistencyTestingToolState@2fef96ad | |
| node1 | 7m 35.345s | 2025-11-17 21:14:01.332 | 11349 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 987 Timestamp: 2025-11-17T21:14:00.002272Z Next consensus number: 30564 Legacy running event hash: a8c25818cefbac67743e9a96139cbd5e063dfafce1c65afb2152f0d29291caff61f5db5e0fa5eb510184e445aece88b8 Legacy running event mnemonic: treat-adapt-camera-industry Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -1804571859 Root hash: 28ca7ea41e1d977824763a3658cfaa6f1604355f7eb29c70122f647880f2a24b1e70a92eb0ac267cd6f656a5f3fd5917 (root) VirtualMap state / bone-stand-senior-baby | |||||||||
| node1 | 7m 35.352s | 2025-11-17 21:14:01.339 | 11350 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+06+41.121642458Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+10+27.157584500Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node1 | 7m 35.352s | 2025-11-17 21:14:01.339 | 11351 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 960 File: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+10+27.157584500Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node1 | 7m 35.352s | 2025-11-17 21:14:01.339 | 11352 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node4 | 7m 35.354s | 2025-11-17 21:14:01.341 | 3250 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/17 for com.swirlds.demo.consistency.ConsistencyTestingToolState@1a4aeb8e | |
| node4 | 7m 35.356s | 2025-11-17 21:14:01.343 | 3251 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 987 Timestamp: 2025-11-17T21:14:00.002272Z Next consensus number: 30564 Legacy running event hash: a8c25818cefbac67743e9a96139cbd5e063dfafce1c65afb2152f0d29291caff61f5db5e0fa5eb510184e445aece88b8 Legacy running event mnemonic: treat-adapt-camera-industry Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -1804571859 Root hash: 28ca7ea41e1d977824763a3658cfaa6f1604355f7eb29c70122f647880f2a24b1e70a92eb0ac267cd6f656a5f3fd5917 (root) VirtualMap state / bone-stand-senior-baby | |||||||||
| node1 | 7m 35.362s | 2025-11-17 21:14:01.349 | 11353 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node1 | 7m 35.362s | 2025-11-17 21:14:01.349 | 11354 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 987 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/987 {"round":987,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/987/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node4 | 7m 35.363s | 2025-11-17 21:14:01.350 | 3252 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/4/2025/11/17/2025-11-17T21+06+41.140869703Z_seq0_minr1_maxr395_orgn0.pces Last file: data/saved/preconsensus-events/4/2025/11/17/2025-11-17T21+12+32.278207882Z_seq1_minr759_maxr1259_orgn786.pces | |||||||||
| node4 | 7m 35.363s | 2025-11-17 21:14:01.350 | 3253 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 960 File: data/saved/preconsensus-events/4/2025/11/17/2025-11-17T21+12+32.278207882Z_seq1_minr759_maxr1259_orgn786.pces | |||||||||
| node4 | 7m 35.363s | 2025-11-17 21:14:01.350 | 3254 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node1 | 7m 35.364s | 2025-11-17 21:14:01.351 | 11355 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/309 | |
| node4 | 7m 35.369s | 2025-11-17 21:14:01.356 | 3263 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node4 | 7m 35.369s | 2025-11-17 21:14:01.356 | 3264 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 987 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/987 {"round":987,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/987/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node4 | 7m 35.371s | 2025-11-17 21:14:01.358 | 3265 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/38 | |
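Every `Information for state written to disk` block for round 987 above reports the same root hash (`28ca7ea4…5917`) and mnemonic (`bone-stand-senior-baby`) on nodes 0 through 4. A minimal sketch for checking that property across a whole log, assuming the layout above, where the detail block is a continuation row without a node column and therefore has to be paired with the node named on the most recent regular row:

```python
import re
from collections import defaultdict

# "Round: ... Root hash: ..." as printed in the state-info continuation rows.
DETAIL_RE = re.compile(r"Round:\s*(\d+).*Root hash:\s*([0-9a-f]+)")

def root_hashes_by_round(lines):
    """Map round -> {node: root hash}, pairing each detail row with the node
    named on the most recent regular row (an assumption about this export)."""
    hashes = defaultdict(dict)
    current_node = None
    for line in lines:
        cells = [cell.strip() for cell in line.split("|")]
        if len(cells) > 1 and cells[1].startswith("node"):
            current_node = cells[1]
            continue
        match = DETAIL_RE.search(line)
        if match and current_node:
            hashes[int(match.group(1))][current_node] = match.group(2)
    return hashes

# Usage: flag any round where the nodes disagree on the reported root hash.
#   for rnd, per_node in root_hashes_by_round(open("swirlds.log")).items():
#       if len(set(per_node.values())) > 1:
#           print("root hash mismatch in round", rnd, per_node)
```

In the excerpt above, all five nodes report the same hash for round 987, so nothing would be flagged.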
| node2 | 7m 59.301s | 2025-11-17 21:14:25.288 | 12062 | WARN | SOCKET_EXCEPTIONS | <<platform-core: SyncProtocolWith0 2 to 0>> | NetworkUtils: | Connection broken: 2 <- 0 | |
| com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-17T21:14:25.285947054Z at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293) at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47) at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79) at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:65) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200) at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654) at java.base/java.lang.Thread.run(Thread.java:1583) Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection or outbound has closed at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115) at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64) at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125) at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252) at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240) at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:387) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection reset at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318) at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346) at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796) at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099) at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489) at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483) at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70) at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461) at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73) at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63) at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291) at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347) at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420) at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399) at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208) at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319) at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:431) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more | |||||||||
| node4 | 7m 59.305s | 2025-11-17 21:14:25.292 | 3821 | WARN | SOCKET_EXCEPTIONS | <<platform-core: SyncProtocolWith0 4 to 0>> | NetworkUtils: | Connection broken: 4 <- 0 | |
| com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-17T21:14:25.289745309Z at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293) at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47) at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79) at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:65) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200) at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654) at java.base/java.lang.Thread.run(Thread.java:1583) Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection or outbound has closed at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115) at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64) at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125) at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252) at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240) at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:387) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection reset at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318) at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346) at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796) at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099) at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489) at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483) at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70) at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461) at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73) at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63) at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291) at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347) at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420) at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399) at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208) at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319) at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:431) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more | |||||||||
| node4 | 7m 59.318s | 2025-11-17 21:14:25.305 | 3822 | WARN | SOCKET_EXCEPTIONS | <<platform-core: SyncProtocolWith1 4 to 1>> | NetworkUtils: | Connection broken: 4 <- 1 | |
| com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-17T21:14:25.303910308Z at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293) at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47) at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79) at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:65) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200) at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654) at java.base/java.lang.Thread.run(Thread.java:1583) Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection or outbound has closed at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115) at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64) at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125) at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252) at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240) at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:387) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection reset at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318) at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346) at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796) at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099) at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489) at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483) at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70) at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461) at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73) at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63) at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291) at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347) at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420) at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399) at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208) at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319) at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:431) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more | |||||||||
| node2 | 7m 59.319s | 2025-11-17 21:14:25.306 | 12063 | WARN | SOCKET_EXCEPTIONS | <<platform-core: SyncProtocolWith1 2 to 1>> | NetworkUtils: | Connection broken: 2 <- 1 | |
| com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-17T21:14:25.303754412Z at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293) at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47) at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79) at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:65) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200) at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654) at java.base/java.lang.Thread.run(Thread.java:1583) Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection or outbound has closed at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115) at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64) at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125) at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252) at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240) at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:387) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection reset at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318) at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346) at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796) at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099) at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489) at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483) at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70) at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461) at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73) at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63) at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291) at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347) at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420) at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399) at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208) at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319) at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:431) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more | |||||||||
| node2 | 7m 59.586s | 2025-11-17 21:14:25.573 | 12072 | WARN | SOCKET_EXCEPTIONS | <<platform-core: SyncProtocolWith3 2 to 3>> | NetworkUtils: | Connection broken: 2 -> 3 | |
| com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-17T21:14:25.571576221Z at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293) at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47) at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79) at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:65) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200) at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654) at java.base/java.lang.Thread.run(Thread.java:1583) Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection or outbound has closed at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115) at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64) at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125) at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252) at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240) at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:387) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection reset at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318) at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346) at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796) at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099) at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489) at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483) at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70) at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461) at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73) at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63) at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291) at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347) at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420) at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399) at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208) at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319) at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:431) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more | |||||||||
| node4 | 7m 59.589s | 2025-11-17 21:14:25.576 | 3831 | WARN | SOCKET_EXCEPTIONS | <<platform-core: SyncProtocolWith3 4 to 3>> | NetworkUtils: | Connection broken: 4 <- 3 | |
| com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-17T21:14:25.571409467Z at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293) at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47) at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79) at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:65) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200) at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654) at java.base/java.lang.Thread.run(Thread.java:1583) Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection or outbound has closed at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115) at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64) at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125) at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252) at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240) at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:387) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection reset at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318) at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346) at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796) at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099) at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489) at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483) at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70) at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461) at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73) at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63) at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291) at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347) at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420) at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399) at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208) at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319) at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:431) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more | |||||||||
| node2 | 8.001m | 2025-11-17 21:14:26.059 | 12073 | WARN | SOCKET_EXCEPTIONS | <<platform-core: SyncProtocolWith4 2 to 4>> | NetworkUtils: | Connection broken: 2 -> 4 | |
| com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-17T21:14:26.055147694Z at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293) at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47) at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79) at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:65) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200) at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654) at java.base/java.lang.Thread.run(Thread.java:1583) Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection or outbound has closed at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115) at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64) at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125) at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252) at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240) at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:387) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection reset at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318) at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346) at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796) at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099) at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489) at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483) at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70) at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461) at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73) at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63) at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291) at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347) at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420) at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399) at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208) at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319) at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:431) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more | |||||||||
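The SOCKET_EXCEPTIONS entries above are seven `Connection reset` traces logged within roughly a second of each other; the stack traces are essentially identical apart from timestamps, and the per-entry signal is the NetworkUtils summary (`Connection broken: 2 <- 0`, `4 <- 0`, `4 <- 1`, `2 <- 1`, `2 -> 3`, `4 <- 3`, `2 -> 4`). A short sketch, under the same export-format assumption as the earlier snippets, that tallies those summaries so the affected peer links can be read off without scanning each trace:

```python
import re
from collections import Counter

# NetworkUtils logs "Connection broken: <a> <- <b>" or "<a> -> <b>"; we simply
# tally the raw (node, arrow, peer) triples without interpreting the arrow.
BROKEN_RE = re.compile(r"Connection broken:\s*(\d+)\s*(<-|->)\s*(\d+)")

def broken_links(lines):
    """Count broken-connection log entries per (node, direction, peer) triple."""
    counts = Counter()
    for line in lines:
        match = BROKEN_RE.search(line)
        if match:
            counts[match.groups()] += 1
    return counts

# Usage (hypothetical file name):
#   for (local, arrow, peer), n in broken_links(open("swirlds.log")).items():
#       print(f"node{local} {arrow} node{peer}: {n}")
```

Run over this excerpt, it would report one broken connection for each of the seven peer pairs listed above.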