| node4 | 0.000ns | 2025-11-17 21:51:34.244 | 1 | INFO | STARTUP | <main> | StaticPlatformBuilder: | ////////////////////// // Node is Starting // ////////////////////// | |
| node4 | 89.000ms | 2025-11-17 21:51:34.333 | 2 | DEBUG | STARTUP | <main> | StaticPlatformBuilder: | main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload] | |
| node4 | 105.000ms | 2025-11-17 21:51:34.349 | 3 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node4 | 215.000ms | 2025-11-17 21:51:34.459 | 4 | INFO | STARTUP | <main> | Browser: | The following nodes [4] are set to run locally | |
| node4 | 244.000ms | 2025-11-17 21:51:34.488 | 5 | DEBUG | STARTUP | <main> | BootstrapUtils: | Scanning the classpath for RuntimeConstructable classes | |
| node3 | 819.000ms | 2025-11-17 21:51:35.063 | 1 | INFO | STARTUP | <main> | StaticPlatformBuilder: | ////////////////////// // Node is Starting // ////////////////////// | |
| node3 | 913.000ms | 2025-11-17 21:51:35.157 | 2 | DEBUG | STARTUP | <main> | StaticPlatformBuilder: | main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload] | |
| node3 | 930.000ms | 2025-11-17 21:51:35.174 | 3 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node2 | 958.000ms | 2025-11-17 21:51:35.202 | 1 | INFO | STARTUP | <main> | StaticPlatformBuilder: | ////////////////////// // Node is Starting // ////////////////////// | |
| node3 | 1.045s | 2025-11-17 21:51:35.289 | 4 | INFO | STARTUP | <main> | Browser: | The following nodes [3] are set to run locally | |
| node2 | 1.056s | 2025-11-17 21:51:35.300 | 2 | DEBUG | STARTUP | <main> | StaticPlatformBuilder: | main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload] | |
| node2 | 1.074s | 2025-11-17 21:51:35.318 | 3 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node3 | 1.076s | 2025-11-17 21:51:35.320 | 5 | DEBUG | STARTUP | <main> | BootstrapUtils: | Scanning the classpath for RuntimeConstructable classes | |
| node2 | 1.197s | 2025-11-17 21:51:35.441 | 4 | INFO | STARTUP | <main> | Browser: | The following nodes [2] are set to run locally | |
| node2 | 1.229s | 2025-11-17 21:51:35.473 | 5 | DEBUG | STARTUP | <main> | BootstrapUtils: | Scanning the classpath for RuntimeConstructable classes | |
| node4 | 1.526s | 2025-11-17 21:51:35.770 | 6 | DEBUG | STARTUP | <main> | BootstrapUtils: | Done with registerConstructables, time taken 1281ms | |
| node0 | 1.532s | 2025-11-17 21:51:35.776 | 1 | INFO | STARTUP | <main> | StaticPlatformBuilder: | ////////////////////// // Node is Starting // ////////////////////// | |
| node4 | 1.535s | 2025-11-17 21:51:35.779 | 7 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | constructor called in Main. | |
| node4 | 1.538s | 2025-11-17 21:51:35.782 | 8 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node4 | 1.573s | 2025-11-17 21:51:35.817 | 9 | INFO | STARTUP | <main> | PrometheusEndpoint: | PrometheusEndpoint: Starting server listing on port: 9999 | |
| node0 | 1.630s | 2025-11-17 21:51:35.874 | 2 | DEBUG | STARTUP | <main> | StaticPlatformBuilder: | main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload] | |
| node4 | 1.635s | 2025-11-17 21:51:35.879 | 10 | WARN | STARTUP | <main> | CryptoStatic: | There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB. | |
| node4 | 1.635s | 2025-11-17 21:51:35.879 | 11 | DEBUG | STARTUP | <main> | CryptoStatic: | Started generating keys | |
| node0 | 1.650s | 2025-11-17 21:51:35.894 | 3 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node1 | 1.723s | 2025-11-17 21:51:35.967 | 1 | INFO | STARTUP | <main> | StaticPlatformBuilder: | ////////////////////// // Node is Starting // ////////////////////// | |
| node0 | 1.777s | 2025-11-17 21:51:36.021 | 4 | INFO | STARTUP | <main> | Browser: | The following nodes [0] are set to run locally | |
| node0 | 1.812s | 2025-11-17 21:51:36.056 | 5 | DEBUG | STARTUP | <main> | BootstrapUtils: | Scanning the classpath for RuntimeConstructable classes | |
| node1 | 1.835s | 2025-11-17 21:51:36.079 | 2 | DEBUG | STARTUP | <main> | StaticPlatformBuilder: | main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload] | |
| node1 | 1.855s | 2025-11-17 21:51:36.099 | 3 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node1 | 1.994s | 2025-11-17 21:51:36.238 | 4 | INFO | STARTUP | <main> | Browser: | The following nodes [1] are set to run locally | |
| node1 | 2.031s | 2025-11-17 21:51:36.275 | 5 | DEBUG | STARTUP | <main> | BootstrapUtils: | Scanning the classpath for RuntimeConstructable classes | |
| node3 | 2.418s | 2025-11-17 21:51:36.662 | 6 | DEBUG | STARTUP | <main> | BootstrapUtils: | Done with registerConstructables, time taken 1341ms | |
| node3 | 2.427s | 2025-11-17 21:51:36.671 | 7 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | constructor called in Main. | |
| node3 | 2.430s | 2025-11-17 21:51:36.674 | 8 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node4 | 2.460s | 2025-11-17 21:51:36.704 | 12 | DEBUG | STARTUP | <main> | CryptoStatic: | Done generating keys | |
| node3 | 2.467s | 2025-11-17 21:51:36.711 | 9 | INFO | STARTUP | <main> | PrometheusEndpoint: | PrometheusEndpoint: Starting server listing on port: 9999 | |
| node3 | 2.528s | 2025-11-17 21:51:36.772 | 10 | WARN | STARTUP | <main> | CryptoStatic: | There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB. | |
| node3 | 2.529s | 2025-11-17 21:51:36.773 | 11 | DEBUG | STARTUP | <main> | CryptoStatic: | Started generating keys | |
| node4 | 2.551s | 2025-11-17 21:51:36.795 | 15 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node4 | 2.553s | 2025-11-17 21:51:36.797 | 16 | INFO | STARTUP | <main> | StartupStateUtils: | No saved states were found on disk. | |
| node4 | 2.588s | 2025-11-17 21:51:36.832 | 21 | INFO | STARTUP | <main> | ConsistencyTestingToolState: | New State Constructed. | |
| node2 | 2.639s | 2025-11-17 21:51:36.883 | 6 | DEBUG | STARTUP | <main> | BootstrapUtils: | Done with registerConstructables, time taken 1409ms | |
| node2 | 2.648s | 2025-11-17 21:51:36.892 | 7 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | constructor called in Main. | |
| node2 | 2.651s | 2025-11-17 21:51:36.895 | 8 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node2 | 2.688s | 2025-11-17 21:51:36.932 | 9 | INFO | STARTUP | <main> | PrometheusEndpoint: | PrometheusEndpoint: Starting server listing on port: 9999 | |
| node2 | 2.751s | 2025-11-17 21:51:36.995 | 10 | WARN | STARTUP | <main> | CryptoStatic: | There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB. | |
| node2 | 2.752s | 2025-11-17 21:51:36.996 | 11 | DEBUG | STARTUP | <main> | CryptoStatic: | Started generating keys | |
| node0 | 3.305s | 2025-11-17 21:51:37.549 | 6 | DEBUG | STARTUP | <main> | BootstrapUtils: | Done with registerConstructables, time taken 1491ms | |
| node0 | 3.314s | 2025-11-17 21:51:37.558 | 7 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | constructor called in Main. | |
| node0 | 3.318s | 2025-11-17 21:51:37.562 | 8 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node4 | 3.348s | 2025-11-17 21:51:37.592 | 24 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node4 | 3.350s | 2025-11-17 21:51:37.594 | 27 | INFO | STARTUP | <main> | BootstrapUtils: | Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]. | |
| node0 | 3.355s | 2025-11-17 21:51:37.599 | 9 | INFO | STARTUP | <main> | PrometheusEndpoint: | PrometheusEndpoint: Starting server listing on port: 9999 | |
| node3 | 3.355s | 2025-11-17 21:51:37.599 | 12 | DEBUG | STARTUP | <main> | CryptoStatic: | Done generating keys | |
| node4 | 3.355s | 2025-11-17 21:51:37.599 | 28 | INFO | STARTUP | <main> | AddressBookInitializer: | Starting from genesis: using the config address book. | |
| node4 | 3.365s | 2025-11-17 21:51:37.609 | 29 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node4 | 3.367s | 2025-11-17 21:51:37.611 | 30 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node0 | 3.422s | 2025-11-17 21:51:37.666 | 10 | WARN | STARTUP | <main> | CryptoStatic: | There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB. | |
| node0 | 3.423s | 2025-11-17 21:51:37.667 | 11 | DEBUG | STARTUP | <main> | CryptoStatic: | Started generating keys | |
| node3 | 3.452s | 2025-11-17 21:51:37.696 | 15 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node3 | 3.454s | 2025-11-17 21:51:37.698 | 16 | INFO | STARTUP | <main> | StartupStateUtils: | No saved states were found on disk. | |
| node3 | 3.491s | 2025-11-17 21:51:37.735 | 21 | INFO | STARTUP | <main> | ConsistencyTestingToolState: | New State Constructed. | |
| node2 | 3.600s | 2025-11-17 21:51:37.844 | 12 | DEBUG | STARTUP | <main> | CryptoStatic: | Done generating keys | |
| node1 | 3.629s | 2025-11-17 21:51:37.873 | 6 | DEBUG | STARTUP | <main> | BootstrapUtils: | Done with registerConstructables, time taken 1597ms | |
| node1 | 3.638s | 2025-11-17 21:51:37.882 | 7 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | constructor called in Main. | |
| node1 | 3.641s | 2025-11-17 21:51:37.885 | 8 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node1 | 3.679s | 2025-11-17 21:51:37.923 | 9 | INFO | STARTUP | <main> | PrometheusEndpoint: | PrometheusEndpoint: Starting server listing on port: 9999 | |
| node2 | 3.695s | 2025-11-17 21:51:37.939 | 15 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node2 | 3.697s | 2025-11-17 21:51:37.941 | 16 | INFO | STARTUP | <main> | StartupStateUtils: | No saved states were found on disk. | |
| node2 | 3.735s | 2025-11-17 21:51:37.979 | 21 | INFO | STARTUP | <main> | ConsistencyTestingToolState: | New State Constructed. | |
| node1 | 3.744s | 2025-11-17 21:51:37.988 | 10 | WARN | STARTUP | <main> | CryptoStatic: | There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB. | |
| node1 | 3.744s | 2025-11-17 21:51:37.988 | 11 | DEBUG | STARTUP | <main> | CryptoStatic: | Started generating keys | |
| node3 | 4.287s | 2025-11-17 21:51:38.531 | 24 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node3 | 4.288s | 2025-11-17 21:51:38.532 | 27 | INFO | STARTUP | <main> | BootstrapUtils: | Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]. | |
| node3 | 4.293s | 2025-11-17 21:51:38.537 | 28 | INFO | STARTUP | <main> | AddressBookInitializer: | Starting from genesis: using the config address book. | |
| node3 | 4.302s | 2025-11-17 21:51:38.546 | 29 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node3 | 4.305s | 2025-11-17 21:51:38.549 | 30 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node0 | 4.314s | 2025-11-17 21:51:38.558 | 12 | DEBUG | STARTUP | <main> | CryptoStatic: | Done generating keys | |
| node0 | 4.408s | 2025-11-17 21:51:38.652 | 15 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node0 | 4.411s | 2025-11-17 21:51:38.655 | 16 | INFO | STARTUP | <main> | StartupStateUtils: | No saved states were found on disk. | |
| node0 | 4.448s | 2025-11-17 21:51:38.692 | 21 | INFO | STARTUP | <main> | ConsistencyTestingToolState: | New State Constructed. | |
| node4 | 4.480s | 2025-11-17 21:51:38.724 | 31 | INFO | STARTUP | <main> | OSHealthChecker: | PASSED - Clock Source Speed Check Report[callsPerSec=26151018] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=244679, randomLong=6232702166528923541, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=7140, randomLong=6568526987997712554, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=1308128, data=35, exception=null] OS Health Check Report - Complete (took 1024 ms) | |
| node4 | 4.510s | 2025-11-17 21:51:38.754 | 32 | DEBUG | STARTUP | <main> | BootstrapUtils: | jvmPauseDetectorThread started | |
| node4 | 4.518s | 2025-11-17 21:51:38.762 | 33 | INFO | STARTUP | <main> | StandardScratchpad: | Scratchpad platform.iss contents: LAST_ISS_ROUND null | |
| node4 | 4.520s | 2025-11-17 21:51:38.764 | 34 | INFO | STARTUP | <main> | PlatformBuilder: | Default platform pool parallelism: 8 | |
| node1 | 4.531s | 2025-11-17 21:51:38.775 | 12 | DEBUG | STARTUP | <main> | CryptoStatic: | Done generating keys | |
| node4 | 4.602s | 2025-11-17 21:51:38.846 | 35 | INFO | STARTUP | <main> | SwirldsPlatform: | Starting with roster history: | |
| RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIdUmpLKzyXgUwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBALXCoDQ+HOVsEDTZpFuJITSaGwaKX2is5K1P/lV+G+ll6u36IdqKNnZIirJrpX2N0Ad6NeF/oFcMhietrKt818PDA9Tbb2tqcHNKTxxZAEj7amQTsrU4EsNmUhaPgMs89yj9WLxCXVzW05cQjqYEA/hymzohWs1BdU3Y2KdmELe0v5fzRgDpNgYHhUN7IrlrlgXEWpuKRskBYc4PIvyACijY0/zkeEAyHOshYYGKhQbNm/NGWhFq83ro77CZZhX3Vl7hRnHLaEoCEE8atY8R1Txhy8aObhiS6R8ZVRTkZLar/FG/xe78RQfwHHD1al2w5oHR7xgTZylhbD+nVQ09Zmi25USpvqwumbMBE0OWhV+VH1WLCHfLQs6/5yuDjeZ/0D9tpQ8pfkiEkGLedzUzQkq+4/HmN4IFTOhgJHlu1tVUqohZIPZ5zSzqkqFzFQGRo2uAX8C2EJ3qgQMAEOpH8iOjiSKsezlIPuwvmrVDPxVfpY2Cq60oxRu6B8bZdbQkfwIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQAloxwiVu7pBhkO4fLqYRw4FC0VEx+c47W4xnrq3G/uXMGwE2Mfwple9FZnfT9JgSoT1UVw+cigo4720WdrPqkK8qnA3/PzGXlfJ3k6eFcBuli/KY1TakIJUAxFt5biNKatheMwAKsbF/JyVyaqG2dbSaXQ6hZBLQTYmLrmFWMvi9QdM1S8vNVMjn0hE2qQJtnVRuVwqRaAQ225jDv2CUCT28t0EWE6ccbiRi74l8KoW1Lo3v2EQ6ZZ89Xt3CwFSQHa6YVT685ECy82qMysU+YHBe9WmwJW05UAAY7JRsOo+RuuU/r4acNLmzprG+l7qsqqPkwXTcziw9Y2OYsFgY4bTlIOV0JC0AYApctDB3gbn83LM73CWccGrXq0liSV0wL11wscH3gFohXrwb646+6hgncZiDshlZlWaFSkHQJAxTR9bsbsCwKdZpzIIVOVTOT/3oLQKCCQvPriTpJiNa0P6gB0pq64lNcyG9fL8vS3YFFnWJTZwb8ZzGK+LZ91/2Y=", "gossipEndpoint": [{ "ipAddressV4": "InvOnA==", "port": 30124 }, { "ipAddressV4": "CoAAHQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAJguXwyGFpb8MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTIwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTIwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDYXoYHBtw8adD5sxLZSnlG9XgBLWVbIDl3YA4rZZ11cgl6FG2TvF8UVNXQ177cRm1xUUJRI5ulSgDofnm7Iuf6c/GoQrud2nP1yMWewGslwiEi1h2pxbN7doFvn/92Y0lJVwSV/vOpbIyPRoMeF0jXd7TEI7dYj4S7gV9uWmQCIWjwTZqVsjIAtzEkYnmS0/m5XuD9MJsin8OQRu/PEFL8qaVPQJ2GhOhpUJqvADQ/Lsq/FHcPjylcRcnUQlFRojk2jqugtoRegByjPrAOSYGJeWUCVYmd7W51L/AkVx1rDLeHj0zLTTzQRF5G56i+S+tAcpY/uiCrwLvszFlDlD1diOuaucmu54lalrSTlVe5eOyq2ga2tKi11LQ+w09105zLyRWk7DBU93f5dTYNSmokI7b4sVRxu6SP0p/F9wND77wv2Ax5OpIWWty8zy8Y+xOuRyFu/rJ4ddDmRYvRmptM0rCAfv6hgd3m5Y/OAadQm/OuN91Uq9PIJdlMtjDbIfECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEANutmL3V1PlvlsZ6xG8Sx9cKTok3kf3rBf7D7eE8Nn8ryHi3cw9CvCaj1E6zmTTh9k23DAZVWulhjTY5GWcx5NO7QAWjKau44g/HecNNrWsD/+nIrhmAk2WxKp175CwqJaIWA7CM6VMfFktjaflUPcB6RJnHrAa8M1HUpEsBz0mFmLz7lIaDemxYCE8M8slb6wTMjpL83GB+ejudRe7YK2ZWixM+CGp0ARkV+EecHaCXgEoROUNwP6mZVJcgSVR1QBQwcGAMIrutsKENM8HR9o3LWacigoJXf+IX8c6aJhrHfFvm62q+hi3baj7iR6gebEdWPtmEXgoVWOk230fLGyPU1oBxaDdYa8V4+ZFv03O91By9tuFrwZOcLCb4CPRyr8A47lHNjRIeo2nUF/c+SjV0eBcPKCnn1nW/AQWCxJ0QzzG6tEeMAGdDrE2ujPlB+Y9Sn8vB0zjYQHTr1NKyyXNogB4y48jofLDLDGOQYI6uP2fDgZeiq4dV8w91WbPHV", "gossipEndpoint": [{ "ipAddressV4": "Ih3SfA==", "port": 30125 }, { "ipAddressV4": "CoAAIQ==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJwswl59m488MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDAqlNMpfduuW0ETQVjdKf5ZBe3Ug/ybRMoCWIlue8UoxFzamAtoeFEW3GVi862iImRVyHbkBZzDQUw4ABwMdxfzTL9voozkMaOZb4KQ9yZ9zNLAAmSSuE6RFmSJnBtfufxFXqiu6esbcvyropjZLc65F2uoMCpKN0CHFpWEb2GZAaipp7WCOon0NllDLqkjPylluXO4mjbzzMSDPbBWRD8VjjkxZeszWSXYxz9hqcRYX01CGg+jhooCQ6j2yB8sfFAffIeTG6GSV1uCFa4san2emhQWpr+cHaVYJMtejL43HaEVQnF3vh5Z10T/7co63C63aay2hs6Bx5SschosyYiafI7GtbQ4qpOgjEDFT1jlydK21gy6MV3SFEYwcUfxvxxRj6pS7xiMFn4FYnBKPJWkaDkwTqboEshxstvASQOW993uEwzh4EjctRHSjSuTU6S9OsWi5I5cRF+xK6GaWsTp0KyO8uVpuM9kZfpOcor294quyKJ9nylNyIt/m8Q8/ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAqPLB/xr0Yv1l9w/RO+bqFtl8TkxF/6jOqoEUXY06dEInopLYpmkksZZ9G8vebt6hAoLjaxNMdRqCkzKgy4jn7/SQZNV9FMbZ7ckiDxsBxYZ2ZaBootuWzzVD6hCSO3Tg6JgkIzldtFtNcDVBRgZnHg+Rl6hn+gFV5S2OTTTPHWK7GHwgHXLhK7N0RL4YVrRCi/HTUZnuYCjBwvdDte5iqytY05cAO4p72P6YtDaOdAfL/IIKd1ylCWITDqTp/JDBz1uxjQmsXLVD/KEEtlvYlGjIr+wUUqIUPhFvB6ajl2NO0D/r+t1BH454zbodU92QnOJpXpoNuOv7jjALHCqo70mCSwTNUSZuVP6/KLmQe8sSzYs7O/c25FzHKBYy+aZujoa/X7aI6XVmsUkj6ae9MSvQurk0jMNg/Jy5EtWOMy7WEuyadrAv6KSP3oIfmL9jWoPcyOMfvjRHxGqOfZuFZatAwswY6O0E3ATTrN03t/BVqNHIYIXc6UOiUTo2Nx56", "gossipEndpoint": [{ "ipAddressV4": "iHQMHg==", "port": 30126 }, { "ipAddressV4": "CoAAHg==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAOxH0o7YkAUoMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDf6+SJl+puqRNd5r2Tb802jQTqPm7k3NXIeU8NQ3Hy9p0G+9p4Hgnt3ftipar7lKPKnp4PFrOP7E7XSKpafxK2OVQ0jTMvc6Yjqt+9mzyNSI1I8cSHTmhJ7kMBt0+NwVM8QN+fbKcbQaoNiPwMcckVtGeMad4aZM6hRyxzI0H3wgMj4JiM9VRwx7JbEo3R7akRwLwGr9ZQm2EQwqiyReNkBnXrsyP4KPPVAoeMfGchoAuBbV+r6v1OeYddocYmZkrsvMXUKF/uEcgd8gTu+pv3jObwIEVqXo1yC6ZlCFqO7LIvT8jTAAljkszoo67ykXTbKS0PZeLDg6nvdPvBMQ50yjfswR88S6N8VU6pud7Y+VbMYUiGzlrFi4MB9dikAjEj4PEetQyZdn84ZXGxerXlU/vTO2Fp4i1ec5rmX1P0WYMlbNELE408j5nfCfzD/qdcF5HZAiUVTYU/SWpzWcn34++KGpuqZZQdsGwCLQWeMeA/OEemYChis4cO94aOzrECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAlj5YIsbYXk2JGP9kRCBLDgz27ymYi1KDbO8g18V4T0zj2Zl7858U7mF9UBSSW+Cjl1UtUdvqFWZhh8jRoO3Jov1QGTULHRfyyPElD4VpwFribiu4GYJaodYy6NE50WwSJf32gLG0jHQWt7q+cOrn6WaG2h8O1sIxbTlnu1kqKQUQtu4oX8u23b5m9QXVJfJVdecwD5Rmab2d3dq/NNv2iNELH0myqtcoqw26xwIvXwaS4Gqi+Y0cOfjWL5Gv5AHIwvBXGIh3KUU7pbyBzqjkigbzSeoZw0C8G2cRTl0+QTuet2SVYlFh5J9/FBLvIfMfIpguglaU6xTVoRpo7RF24qQKFt2IlBROpqcwl0FyfE+2c19FGt1V8E5dYqE4T2mHT6FSOI3DckA2afBm1OCeMNtkqCQT8x+JvdKrgUh44QDm4PIVZDzaxog/zOzRWPCgpCPq0HcNMzgCVFt+4q8eTL9Ju/rQcS9bDosjMA69NGLIOCdPW2i/gkS9x9rTXgyp", "gossipEndpoint": [{ "ipAddressV4": "IkQRmQ==", "port": 30127 }, { "ipAddressV4": "CoAAHw==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIXlngkVEv6iMwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAL4o3FK8th1cG+FSlw4iT9FlkwK+hOj4Ay6Z70mZlsNwszgxvddUEO4BEdA1iSWfxkYOLl4QwwPr3l394a07VfB5OK3dqJ6CjVdByyvzghtk3gOpkskWlJxp6vah7BbIJFWE8off7fhCdwAGSrwIRdGE8u8GbKJIdHk6/XyjB3j0BXTIgeaPTJxLeuz/2l/dQVRMXyZNxlc5UVQYnX9haMRk7M5bkb9uwfYPRikEJFp6G72x7M7Q9lBGJ3ArCQn/lPJfHSg01GxfDhWH8DOwLaFdv1bCs2zHTn7R7Wq9ymXvkUsZhlYO4mLR8HKDcM3sCrJa2rg8vgnIoZupHABKxkgtT2wxV7fM5f2oiz0mDYDTRJpgmK1lmNANj2tKnGqeDnsW7Q3zwufgZZhbks8+8uigyOyKNbp6D7Vv5KeYRibjr/xh+yWT0v02dtpBIdhqDa5CUVD9fCwigZj3PQc8N4e47ZL6s1pXpQ6Cf0lB0fSsvyhnGRa8HMx2q5eg5j/lCQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQCr9yUzOoi0xhoDE1mqR3FR/iVCq9PaBUURWL743LDMrlEvpzKX0upcwwwdgJFjVqVUywh6rKeHQt4O4UV6FIbpp0PSjSE7XZSK3UNqnhZJhQ3aNrOP+6wBhm2B0ZjrxyMS1EWeD9tcNkdYluO00RlieAEV4zwoAfeFPSB21iXW5dU8idhNuTLptDc7SJoErxN+44jvcrSe/ZhpQohG6WfyDPH0BE1tyzsiD29PAWKkrfhg5kzjTAP/qFp+ByazeltP9/F0NXI5AHbE0pKYr56XUlwDfDZOTU9b1YeS7kKyPvccvC2j9NjGGM7NjafdFLHUTYBZiNUTZXVstddYtTCVbTqI7I/x6hoeeNVDZv7XluwZLrYsDNsNrWU3c9VijPK1CE5Owy+gJoGgxEHfA/n9Jvc3lEesqKBpW92RazkpHW2eD9wh8Ayv3q6PNDGzWyiXA8YWW6yD/dIp2Oh8szZUfOXy8sQ8VW86T6RsqGP5CKKPGW1NnP/KTKe5/WoBLZQ=", "gossipEndpoint": [{ "ipAddressV4": "Ike+6w==", "port": 30128 }, { "ipAddressV4": "CoAAIA==", "port": 30128 }] }] } | |||||||||
| node2 | 4.624s | 2025-11-17 21:51:38.868 | 24 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node4 | 4.625s | 2025-11-17 21:51:38.869 | 36 | INFO | STARTUP | <main> | TransactionHandlingHistory: | Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv | |
| node4 | 4.626s | 2025-11-17 21:51:38.870 | 37 | INFO | STARTUP | <main> | TransactionHandlingHistory: | No log file found. Starting without any previous history | |
| node1 | 4.627s | 2025-11-17 21:51:38.871 | 15 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node2 | 4.627s | 2025-11-17 21:51:38.871 | 27 | INFO | STARTUP | <main> | BootstrapUtils: | Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]. | |
| node1 | 4.630s | 2025-11-17 21:51:38.874 | 16 | INFO | STARTUP | <main> | StartupStateUtils: | No saved states were found on disk. | |
| node2 | 4.634s | 2025-11-17 21:51:38.878 | 28 | INFO | STARTUP | <main> | AddressBookInitializer: | Starting from genesis: using the config address book. | |
| node4 | 4.638s | 2025-11-17 21:51:38.882 | 38 | INFO | STARTUP | <main> | StateInitializer: | The platform is using the following initial state: Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: 61637a9249941b41469383786ea3e9cf47d7e3c2e8c4a83b6668ae7def5f6c79683a4d924180b5192cfc05b2082b9003 (root) VirtualMap state / cluster-false-crater-habit | |
| node4 | 4.641s | 2025-11-17 21:51:38.885 | 40 | INFO | RECONNECT | <<platform-core: reconnectController>> | ReconnectController: | Starting the ReconnectController | |
| node2 | 4.646s | 2025-11-17 21:51:38.890 | 29 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node2 | 4.649s | 2025-11-17 21:51:38.893 | 30 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node1 | 4.667s | 2025-11-17 21:51:38.911 | 21 | INFO | STARTUP | <main> | ConsistencyTestingToolState: | New State Constructed. | |
| node4 | 4.837s | 2025-11-17 21:51:39.081 | 41 | INFO | EVENT_STREAM | <main> | DefaultConsensusEventStream: | EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b | |
| node4 | 4.842s | 2025-11-17 21:51:39.086 | 42 | INFO | STARTUP | <platformForkJoinThread-2> | Shadowgraph: | Shadowgraph starting from expiration threshold 1 | |
| node4 | 4.845s | 2025-11-17 21:51:39.089 | 43 | INFO | STARTUP | <<start-node-4>> | ConsistencyTestingToolMain: | init called in Main for node 4. | |
| node4 | 4.846s | 2025-11-17 21:51:39.090 | 44 | INFO | STARTUP | <<start-node-4>> | SwirldsPlatform: | Starting platform 4 | |
| node4 | 4.847s | 2025-11-17 21:51:39.091 | 45 | INFO | STARTUP | <<platform: recycle-bin-cleanup>> | RecycleBinImpl: | Deleted 0 files from the recycle bin. | |
| node4 | 4.850s | 2025-11-17 21:51:39.094 | 46 | INFO | STARTUP | <<start-node-4>> | CycleFinder: | No cyclical back pressure detected in wiring model. | |
| node4 | 4.851s | 2025-11-17 21:51:39.095 | 47 | INFO | STARTUP | <<start-node-4>> | DirectSchedulerChecks: | No illegal direct scheduler use detected in the wiring model. | |
| node4 | 4.852s | 2025-11-17 21:51:39.096 | 48 | INFO | STARTUP | <<start-node-4>> | InputWireChecks: | All input wires have been bound. | |
| node4 | 4.853s | 2025-11-17 21:51:39.097 | 49 | WARN | STARTUP | <<start-node-4>> | PcesFileTracker: | No preconsensus event files available | |
| node4 | 4.854s | 2025-11-17 21:51:39.098 | 50 | INFO | STARTUP | <<start-node-4>> | SwirldsPlatform: | replaying preconsensus event stream starting at 0 | |
| node4 | 4.856s | 2025-11-17 21:51:39.100 | 51 | INFO | STARTUP | <<start-node-4>> | PcesReplayer: | Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds. | |
| node4 | 4.857s | 2025-11-17 21:51:39.101 | 52 | INFO | STARTUP | <<app: appMain 4>> | ConsistencyTestingToolMain: | run called in Main. | |
| node4 | 4.858s | 2025-11-17 21:51:39.102 | 53 | INFO | PLATFORM_STATUS | <platformForkJoinThread-1> | StatusStateMachine: | Platform spent 163.0 ms in STARTING_UP. Now in REPLAYING_EVENTS | |
| node4 | 4.863s | 2025-11-17 21:51:39.107 | 54 | INFO | PLATFORM_STATUS | <platformForkJoinThread-1> | StatusStateMachine: | Platform spent 3.0 ms in REPLAYING_EVENTS. Now in OBSERVING | |
| node0 | 5.307s | 2025-11-17 21:51:39.551 | 24 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node0 | 5.310s | 2025-11-17 21:51:39.554 | 26 | INFO | STARTUP | <main> | BootstrapUtils: | Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]. | |
| node0 | 5.317s | 2025-11-17 21:51:39.561 | 28 | INFO | STARTUP | <main> | AddressBookInitializer: | Starting from genesis: using the config address book. | |
| node0 | 5.330s | 2025-11-17 21:51:39.574 | 29 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node0 | 5.333s | 2025-11-17 21:51:39.577 | 30 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node3 | 5.440s | 2025-11-17 21:51:39.684 | 31 | INFO | STARTUP | <main> | OSHealthChecker: | PASSED - Clock Source Speed Check Report[callsPerSec=26182946] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=213480, randomLong=2264522527393534698, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=10560, randomLong=-8093929113790353471, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=1407520, data=35, exception=null] OS Health Check Report - Complete (took 1024 ms) | |
| node3 | 5.470s | 2025-11-17 21:51:39.714 | 32 | DEBUG | STARTUP | <main> | BootstrapUtils: | jvmPauseDetectorThread started | |
| node3 | 5.478s | 2025-11-17 21:51:39.722 | 33 | INFO | STARTUP | <main> | StandardScratchpad: | Scratchpad platform.iss contents: LAST_ISS_ROUND null | |
| node3 | 5.480s | 2025-11-17 21:51:39.724 | 34 | INFO | STARTUP | <main> | PlatformBuilder: | Default platform pool parallelism: 8 | |
| node1 | 5.542s | 2025-11-17 21:51:39.786 | 24 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node1 | 5.544s | 2025-11-17 21:51:39.788 | 27 | INFO | STARTUP | <main> | BootstrapUtils: | Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]. | |
| node1 | 5.551s | 2025-11-17 21:51:39.795 | 28 | INFO | STARTUP | <main> | AddressBookInitializer: | Starting from genesis: using the config address book. | |
| node1 | 5.561s | 2025-11-17 21:51:39.805 | 29 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node1 | 5.564s | 2025-11-17 21:51:39.808 | 30 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node3 | 5.564s | 2025-11-17 21:51:39.808 | 35 | INFO | STARTUP | <main> | SwirldsPlatform: | Starting with roster history: | |
| RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIdUmpLKzyXgUwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBALXCoDQ+HOVsEDTZpFuJITSaGwaKX2is5K1P/lV+G+ll6u36IdqKNnZIirJrpX2N0Ad6NeF/oFcMhietrKt818PDA9Tbb2tqcHNKTxxZAEj7amQTsrU4EsNmUhaPgMs89yj9WLxCXVzW05cQjqYEA/hymzohWs1BdU3Y2KdmELe0v5fzRgDpNgYHhUN7IrlrlgXEWpuKRskBYc4PIvyACijY0/zkeEAyHOshYYGKhQbNm/NGWhFq83ro77CZZhX3Vl7hRnHLaEoCEE8atY8R1Txhy8aObhiS6R8ZVRTkZLar/FG/xe78RQfwHHD1al2w5oHR7xgTZylhbD+nVQ09Zmi25USpvqwumbMBE0OWhV+VH1WLCHfLQs6/5yuDjeZ/0D9tpQ8pfkiEkGLedzUzQkq+4/HmN4IFTOhgJHlu1tVUqohZIPZ5zSzqkqFzFQGRo2uAX8C2EJ3qgQMAEOpH8iOjiSKsezlIPuwvmrVDPxVfpY2Cq60oxRu6B8bZdbQkfwIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQAloxwiVu7pBhkO4fLqYRw4FC0VEx+c47W4xnrq3G/uXMGwE2Mfwple9FZnfT9JgSoT1UVw+cigo4720WdrPqkK8qnA3/PzGXlfJ3k6eFcBuli/KY1TakIJUAxFt5biNKatheMwAKsbF/JyVyaqG2dbSaXQ6hZBLQTYmLrmFWMvi9QdM1S8vNVMjn0hE2qQJtnVRuVwqRaAQ225jDv2CUCT28t0EWE6ccbiRi74l8KoW1Lo3v2EQ6ZZ89Xt3CwFSQHa6YVT685ECy82qMysU+YHBe9WmwJW05UAAY7JRsOo+RuuU/r4acNLmzprG+l7qsqqPkwXTcziw9Y2OYsFgY4bTlIOV0JC0AYApctDB3gbn83LM73CWccGrXq0liSV0wL11wscH3gFohXrwb646+6hgncZiDshlZlWaFSkHQJAxTR9bsbsCwKdZpzIIVOVTOT/3oLQKCCQvPriTpJiNa0P6gB0pq64lNcyG9fL8vS3YFFnWJTZwb8ZzGK+LZ91/2Y=", "gossipEndpoint": [{ "ipAddressV4": "InvOnA==", "port": 30124 }, { "ipAddressV4": "CoAAHQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAJguXwyGFpb8MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTIwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTIwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDYXoYHBtw8adD5sxLZSnlG9XgBLWVbIDl3YA4rZZ11cgl6FG2TvF8UVNXQ177cRm1xUUJRI5ulSgDofnm7Iuf6c/GoQrud2nP1yMWewGslwiEi1h2pxbN7doFvn/92Y0lJVwSV/vOpbIyPRoMeF0jXd7TEI7dYj4S7gV9uWmQCIWjwTZqVsjIAtzEkYnmS0/m5XuD9MJsin8OQRu/PEFL8qaVPQJ2GhOhpUJqvADQ/Lsq/FHcPjylcRcnUQlFRojk2jqugtoRegByjPrAOSYGJeWUCVYmd7W51L/AkVx1rDLeHj0zLTTzQRF5G56i+S+tAcpY/uiCrwLvszFlDlD1diOuaucmu54lalrSTlVe5eOyq2ga2tKi11LQ+w09105zLyRWk7DBU93f5dTYNSmokI7b4sVRxu6SP0p/F9wND77wv2Ax5OpIWWty8zy8Y+xOuRyFu/rJ4ddDmRYvRmptM0rCAfv6hgd3m5Y/OAadQm/OuN91Uq9PIJdlMtjDbIfECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEANutmL3V1PlvlsZ6xG8Sx9cKTok3kf3rBf7D7eE8Nn8ryHi3cw9CvCaj1E6zmTTh9k23DAZVWulhjTY5GWcx5NO7QAWjKau44g/HecNNrWsD/+nIrhmAk2WxKp175CwqJaIWA7CM6VMfFktjaflUPcB6RJnHrAa8M1HUpEsBz0mFmLz7lIaDemxYCE8M8slb6wTMjpL83GB+ejudRe7YK2ZWixM+CGp0ARkV+EecHaCXgEoROUNwP6mZVJcgSVR1QBQwcGAMIrutsKENM8HR9o3LWacigoJXf+IX8c6aJhrHfFvm62q+hi3baj7iR6gebEdWPtmEXgoVWOk230fLGyPU1oBxaDdYa8V4+ZFv03O91By9tuFrwZOcLCb4CPRyr8A47lHNjRIeo2nUF/c+SjV0eBcPKCnn1nW/AQWCxJ0QzzG6tEeMAGdDrE2ujPlB+Y9Sn8vB0zjYQHTr1NKyyXNogB4y48jofLDLDGOQYI6uP2fDgZeiq4dV8w91WbPHV", "gossipEndpoint": [{ "ipAddressV4": "Ih3SfA==", "port": 30125 }, { "ipAddressV4": "CoAAIQ==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJwswl59m488MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDAqlNMpfduuW0ETQVjdKf5ZBe3Ug/ybRMoCWIlue8UoxFzamAtoeFEW3GVi862iImRVyHbkBZzDQUw4ABwMdxfzTL9voozkMaOZb4KQ9yZ9zNLAAmSSuE6RFmSJnBtfufxFXqiu6esbcvyropjZLc65F2uoMCpKN0CHFpWEb2GZAaipp7WCOon0NllDLqkjPylluXO4mjbzzMSDPbBWRD8VjjkxZeszWSXYxz9hqcRYX01CGg+jhooCQ6j2yB8sfFAffIeTG6GSV1uCFa4san2emhQWpr+cHaVYJMtejL43HaEVQnF3vh5Z10T/7co63C63aay2hs6Bx5SschosyYiafI7GtbQ4qpOgjEDFT1jlydK21gy6MV3SFEYwcUfxvxxRj6pS7xiMFn4FYnBKPJWkaDkwTqboEshxstvASQOW993uEwzh4EjctRHSjSuTU6S9OsWi5I5cRF+xK6GaWsTp0KyO8uVpuM9kZfpOcor294quyKJ9nylNyIt/m8Q8/ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAqPLB/xr0Yv1l9w/RO+bqFtl8TkxF/6jOqoEUXY06dEInopLYpmkksZZ9G8vebt6hAoLjaxNMdRqCkzKgy4jn7/SQZNV9FMbZ7ckiDxsBxYZ2ZaBootuWzzVD6hCSO3Tg6JgkIzldtFtNcDVBRgZnHg+Rl6hn+gFV5S2OTTTPHWK7GHwgHXLhK7N0RL4YVrRCi/HTUZnuYCjBwvdDte5iqytY05cAO4p72P6YtDaOdAfL/IIKd1ylCWITDqTp/JDBz1uxjQmsXLVD/KEEtlvYlGjIr+wUUqIUPhFvB6ajl2NO0D/r+t1BH454zbodU92QnOJpXpoNuOv7jjALHCqo70mCSwTNUSZuVP6/KLmQe8sSzYs7O/c25FzHKBYy+aZujoa/X7aI6XVmsUkj6ae9MSvQurk0jMNg/Jy5EtWOMy7WEuyadrAv6KSP3oIfmL9jWoPcyOMfvjRHxGqOfZuFZatAwswY6O0E3ATTrN03t/BVqNHIYIXc6UOiUTo2Nx56", "gossipEndpoint": [{ "ipAddressV4": "iHQMHg==", "port": 30126 }, { "ipAddressV4": "CoAAHg==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAOxH0o7YkAUoMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDf6+SJl+puqRNd5r2Tb802jQTqPm7k3NXIeU8NQ3Hy9p0G+9p4Hgnt3ftipar7lKPKnp4PFrOP7E7XSKpafxK2OVQ0jTMvc6Yjqt+9mzyNSI1I8cSHTmhJ7kMBt0+NwVM8QN+fbKcbQaoNiPwMcckVtGeMad4aZM6hRyxzI0H3wgMj4JiM9VRwx7JbEo3R7akRwLwGr9ZQm2EQwqiyReNkBnXrsyP4KPPVAoeMfGchoAuBbV+r6v1OeYddocYmZkrsvMXUKF/uEcgd8gTu+pv3jObwIEVqXo1yC6ZlCFqO7LIvT8jTAAljkszoo67ykXTbKS0PZeLDg6nvdPvBMQ50yjfswR88S6N8VU6pud7Y+VbMYUiGzlrFi4MB9dikAjEj4PEetQyZdn84ZXGxerXlU/vTO2Fp4i1ec5rmX1P0WYMlbNELE408j5nfCfzD/qdcF5HZAiUVTYU/SWpzWcn34++KGpuqZZQdsGwCLQWeMeA/OEemYChis4cO94aOzrECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAlj5YIsbYXk2JGP9kRCBLDgz27ymYi1KDbO8g18V4T0zj2Zl7858U7mF9UBSSW+Cjl1UtUdvqFWZhh8jRoO3Jov1QGTULHRfyyPElD4VpwFribiu4GYJaodYy6NE50WwSJf32gLG0jHQWt7q+cOrn6WaG2h8O1sIxbTlnu1kqKQUQtu4oX8u23b5m9QXVJfJVdecwD5Rmab2d3dq/NNv2iNELH0myqtcoqw26xwIvXwaS4Gqi+Y0cOfjWL5Gv5AHIwvBXGIh3KUU7pbyBzqjkigbzSeoZw0C8G2cRTl0+QTuet2SVYlFh5J9/FBLvIfMfIpguglaU6xTVoRpo7RF24qQKFt2IlBROpqcwl0FyfE+2c19FGt1V8E5dYqE4T2mHT6FSOI3DckA2afBm1OCeMNtkqCQT8x+JvdKrgUh44QDm4PIVZDzaxog/zOzRWPCgpCPq0HcNMzgCVFt+4q8eTL9Ju/rQcS9bDosjMA69NGLIOCdPW2i/gkS9x9rTXgyp", "gossipEndpoint": [{ "ipAddressV4": "IkQRmQ==", "port": 30127 }, { "ipAddressV4": "CoAAHw==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIXlngkVEv6iMwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAL4o3FK8th1cG+FSlw4iT9FlkwK+hOj4Ay6Z70mZlsNwszgxvddUEO4BEdA1iSWfxkYOLl4QwwPr3l394a07VfB5OK3dqJ6CjVdByyvzghtk3gOpkskWlJxp6vah7BbIJFWE8off7fhCdwAGSrwIRdGE8u8GbKJIdHk6/XyjB3j0BXTIgeaPTJxLeuz/2l/dQVRMXyZNxlc5UVQYnX9haMRk7M5bkb9uwfYPRikEJFp6G72x7M7Q9lBGJ3ArCQn/lPJfHSg01GxfDhWH8DOwLaFdv1bCs2zHTn7R7Wq9ymXvkUsZhlYO4mLR8HKDcM3sCrJa2rg8vgnIoZupHABKxkgtT2wxV7fM5f2oiz0mDYDTRJpgmK1lmNANj2tKnGqeDnsW7Q3zwufgZZhbks8+8uigyOyKNbp6D7Vv5KeYRibjr/xh+yWT0v02dtpBIdhqDa5CUVD9fCwigZj3PQc8N4e47ZL6s1pXpQ6Cf0lB0fSsvyhnGRa8HMx2q5eg5j/lCQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQCr9yUzOoi0xhoDE1mqR3FR/iVCq9PaBUURWL743LDMrlEvpzKX0upcwwwdgJFjVqVUywh6rKeHQt4O4UV6FIbpp0PSjSE7XZSK3UNqnhZJhQ3aNrOP+6wBhm2B0ZjrxyMS1EWeD9tcNkdYluO00RlieAEV4zwoAfeFPSB21iXW5dU8idhNuTLptDc7SJoErxN+44jvcrSe/ZhpQohG6WfyDPH0BE1tyzsiD29PAWKkrfhg5kzjTAP/qFp+ByazeltP9/F0NXI5AHbE0pKYr56XUlwDfDZOTU9b1YeS7kKyPvccvC2j9NjGGM7NjafdFLHUTYBZiNUTZXVstddYtTCVbTqI7I/x6hoeeNVDZv7XluwZLrYsDNsNrWU3c9VijPK1CE5Owy+gJoGgxEHfA/n9Jvc3lEesqKBpW92RazkpHW2eD9wh8Ayv3q6PNDGzWyiXA8YWW6yD/dIp2Oh8szZUfOXy8sQ8VW86T6RsqGP5CKKPGW1NnP/KTKe5/WoBLZQ=", "gossipEndpoint": [{ "ipAddressV4": "Ike+6w==", "port": 30128 }, { "ipAddressV4": "CoAAIA==", "port": 30128 }] }] } | |||||||||
| node3 | 5.589s | 2025-11-17 21:51:39.833 | 36 | INFO | STARTUP | <main> | TransactionHandlingHistory: | Consistency testing tool log path: data/saved/consistency-test/3/ConsistencyTestLog.csv | |
| node3 | 5.589s | 2025-11-17 21:51:39.833 | 37 | INFO | STARTUP | <main> | TransactionHandlingHistory: | No log file found. Starting without any previous history | |
| node3 | 5.602s | 2025-11-17 21:51:39.846 | 38 | INFO | STARTUP | <main> | StateInitializer: | The platform is using the following initial state: Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: 61637a9249941b41469383786ea3e9cf47d7e3c2e8c4a83b6668ae7def5f6c79683a4d924180b5192cfc05b2082b9003 (root) VirtualMap state / cluster-false-crater-habit | |
| node3 | 5.605s | 2025-11-17 21:51:39.849 | 40 | INFO | RECONNECT | <<platform-core: reconnectController>> | ReconnectController: | Starting the ReconnectController | |
| node2 | 5.785s | 2025-11-17 21:51:40.029 | 31 | INFO | STARTUP | <main> | OSHealthChecker: | PASSED - Clock Source Speed Check Report[callsPerSec=26425389] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=145140, randomLong=-6759575777752376075, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=21690, randomLong=3189192013231276821, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=1630380, data=35, exception=null] OS Health Check Report - Complete (took 1029 ms) | |
| node3 | 5.812s | 2025-11-17 21:51:40.056 | 41 | INFO | EVENT_STREAM | <main> | DefaultConsensusEventStream: | EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b | |
| node3 | 5.817s | 2025-11-17 21:51:40.061 | 42 | INFO | STARTUP | <platformForkJoinThread-2> | Shadowgraph: | Shadowgraph starting from expiration threshold 1 | |
| node3 | 5.822s | 2025-11-17 21:51:40.066 | 43 | INFO | STARTUP | <<start-node-3>> | ConsistencyTestingToolMain: | init called in Main for node 3. | |
| node3 | 5.823s | 2025-11-17 21:51:40.067 | 44 | INFO | STARTUP | <<start-node-3>> | SwirldsPlatform: | Starting platform 3 | |
| node3 | 5.824s | 2025-11-17 21:51:40.068 | 45 | INFO | STARTUP | <<platform: recycle-bin-cleanup>> | RecycleBinImpl: | Deleted 0 files from the recycle bin. | |
| node2 | 5.825s | 2025-11-17 21:51:40.069 | 32 | DEBUG | STARTUP | <main> | BootstrapUtils: | jvmPauseDetectorThread started | |
| node3 | 5.827s | 2025-11-17 21:51:40.071 | 46 | INFO | STARTUP | <<start-node-3>> | CycleFinder: | No cyclical back pressure detected in wiring model. | |
| node3 | 5.829s | 2025-11-17 21:51:40.073 | 47 | INFO | STARTUP | <<start-node-3>> | DirectSchedulerChecks: | No illegal direct scheduler use detected in the wiring model. | |
| node3 | 5.829s | 2025-11-17 21:51:40.073 | 48 | INFO | STARTUP | <<start-node-3>> | InputWireChecks: | All input wires have been bound. | |
| node3 | 5.831s | 2025-11-17 21:51:40.075 | 49 | WARN | STARTUP | <<start-node-3>> | PcesFileTracker: | No preconsensus event files available | |
| node3 | 5.831s | 2025-11-17 21:51:40.075 | 50 | INFO | STARTUP | <<start-node-3>> | SwirldsPlatform: | replaying preconsensus event stream starting at 0 | |
| node3 | 5.833s | 2025-11-17 21:51:40.077 | 51 | INFO | STARTUP | <<start-node-3>> | PcesReplayer: | Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds. | |
| node3 | 5.834s | 2025-11-17 21:51:40.078 | 52 | INFO | STARTUP | <<app: appMain 3>> | ConsistencyTestingToolMain: | run called in Main. | |
| node2 | 5.835s | 2025-11-17 21:51:40.079 | 33 | INFO | STARTUP | <main> | StandardScratchpad: | Scratchpad platform.iss contents: LAST_ISS_ROUND null | |
| node3 | 5.836s | 2025-11-17 21:51:40.080 | 53 | INFO | PLATFORM_STATUS | <platformForkJoinThread-2> | StatusStateMachine: | Platform spent 175.0 ms in STARTING_UP. Now in REPLAYING_EVENTS | |
| node2 | 5.838s | 2025-11-17 21:51:40.082 | 34 | INFO | STARTUP | <main> | PlatformBuilder: | Default platform pool parallelism: 8 | |
| node3 | 5.841s | 2025-11-17 21:51:40.085 | 54 | INFO | PLATFORM_STATUS | <platformForkJoinThread-2> | StatusStateMachine: | Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING | |
| node2 | 5.935s | 2025-11-17 21:51:40.179 | 35 | INFO | STARTUP | <main> | SwirldsPlatform: | Starting with roster history: | |
| RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIdUmpLKzyXgUwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBALXCoDQ+HOVsEDTZpFuJITSaGwaKX2is5K1P/lV+G+ll6u36IdqKNnZIirJrpX2N0Ad6NeF/oFcMhietrKt818PDA9Tbb2tqcHNKTxxZAEj7amQTsrU4EsNmUhaPgMs89yj9WLxCXVzW05cQjqYEA/hymzohWs1BdU3Y2KdmELe0v5fzRgDpNgYHhUN7IrlrlgXEWpuKRskBYc4PIvyACijY0/zkeEAyHOshYYGKhQbNm/NGWhFq83ro77CZZhX3Vl7hRnHLaEoCEE8atY8R1Txhy8aObhiS6R8ZVRTkZLar/FG/xe78RQfwHHD1al2w5oHR7xgTZylhbD+nVQ09Zmi25USpvqwumbMBE0OWhV+VH1WLCHfLQs6/5yuDjeZ/0D9tpQ8pfkiEkGLedzUzQkq+4/HmN4IFTOhgJHlu1tVUqohZIPZ5zSzqkqFzFQGRo2uAX8C2EJ3qgQMAEOpH8iOjiSKsezlIPuwvmrVDPxVfpY2Cq60oxRu6B8bZdbQkfwIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQAloxwiVu7pBhkO4fLqYRw4FC0VEx+c47W4xnrq3G/uXMGwE2Mfwple9FZnfT9JgSoT1UVw+cigo4720WdrPqkK8qnA3/PzGXlfJ3k6eFcBuli/KY1TakIJUAxFt5biNKatheMwAKsbF/JyVyaqG2dbSaXQ6hZBLQTYmLrmFWMvi9QdM1S8vNVMjn0hE2qQJtnVRuVwqRaAQ225jDv2CUCT28t0EWE6ccbiRi74l8KoW1Lo3v2EQ6ZZ89Xt3CwFSQHa6YVT685ECy82qMysU+YHBe9WmwJW05UAAY7JRsOo+RuuU/r4acNLmzprG+l7qsqqPkwXTcziw9Y2OYsFgY4bTlIOV0JC0AYApctDB3gbn83LM73CWccGrXq0liSV0wL11wscH3gFohXrwb646+6hgncZiDshlZlWaFSkHQJAxTR9bsbsCwKdZpzIIVOVTOT/3oLQKCCQvPriTpJiNa0P6gB0pq64lNcyG9fL8vS3YFFnWJTZwb8ZzGK+LZ91/2Y=", "gossipEndpoint": [{ "ipAddressV4": "InvOnA==", "port": 30124 }, { "ipAddressV4": "CoAAHQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAJguXwyGFpb8MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTIwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTIwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDYXoYHBtw8adD5sxLZSnlG9XgBLWVbIDl3YA4rZZ11cgl6FG2TvF8UVNXQ177cRm1xUUJRI5ulSgDofnm7Iuf6c/GoQrud2nP1yMWewGslwiEi1h2pxbN7doFvn/92Y0lJVwSV/vOpbIyPRoMeF0jXd7TEI7dYj4S7gV9uWmQCIWjwTZqVsjIAtzEkYnmS0/m5XuD9MJsin8OQRu/PEFL8qaVPQJ2GhOhpUJqvADQ/Lsq/FHcPjylcRcnUQlFRojk2jqugtoRegByjPrAOSYGJeWUCVYmd7W51L/AkVx1rDLeHj0zLTTzQRF5G56i+S+tAcpY/uiCrwLvszFlDlD1diOuaucmu54lalrSTlVe5eOyq2ga2tKi11LQ+w09105zLyRWk7DBU93f5dTYNSmokI7b4sVRxu6SP0p/F9wND77wv2Ax5OpIWWty8zy8Y+xOuRyFu/rJ4ddDmRYvRmptM0rCAfv6hgd3m5Y/OAadQm/OuN91Uq9PIJdlMtjDbIfECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEANutmL3V1PlvlsZ6xG8Sx9cKTok3kf3rBf7D7eE8Nn8ryHi3cw9CvCaj1E6zmTTh9k23DAZVWulhjTY5GWcx5NO7QAWjKau44g/HecNNrWsD/+nIrhmAk2WxKp175CwqJaIWA7CM6VMfFktjaflUPcB6RJnHrAa8M1HUpEsBz0mFmLz7lIaDemxYCE8M8slb6wTMjpL83GB+ejudRe7YK2ZWixM+CGp0ARkV+EecHaCXgEoROUNwP6mZVJcgSVR1QBQwcGAMIrutsKENM8HR9o3LWacigoJXf+IX8c6aJhrHfFvm62q+hi3baj7iR6gebEdWPtmEXgoVWOk230fLGyPU1oBxaDdYa8V4+ZFv03O91By9tuFrwZOcLCb4CPRyr8A47lHNjRIeo2nUF/c+SjV0eBcPKCnn1nW/AQWCxJ0QzzG6tEeMAGdDrE2ujPlB+Y9Sn8vB0zjYQHTr1NKyyXNogB4y48jofLDLDGOQYI6uP2fDgZeiq4dV8w91WbPHV", "gossipEndpoint": [{ "ipAddressV4": "Ih3SfA==", "port": 30125 }, { "ipAddressV4": "CoAAIQ==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJwswl59m488MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDAqlNMpfduuW0ETQVjdKf5ZBe3Ug/ybRMoCWIlue8UoxFzamAtoeFEW3GVi862iImRVyHbkBZzDQUw4ABwMdxfzTL9voozkMaOZb4KQ9yZ9zNLAAmSSuE6RFmSJnBtfufxFXqiu6esbcvyropjZLc65F2uoMCpKN0CHFpWEb2GZAaipp7WCOon0NllDLqkjPylluXO4mjbzzMSDPbBWRD8VjjkxZeszWSXYxz9hqcRYX01CGg+jhooCQ6j2yB8sfFAffIeTG6GSV1uCFa4san2emhQWpr+cHaVYJMtejL43HaEVQnF3vh5Z10T/7co63C63aay2hs6Bx5SschosyYiafI7GtbQ4qpOgjEDFT1jlydK21gy6MV3SFEYwcUfxvxxRj6pS7xiMFn4FYnBKPJWkaDkwTqboEshxstvASQOW993uEwzh4EjctRHSjSuTU6S9OsWi5I5cRF+xK6GaWsTp0KyO8uVpuM9kZfpOcor294quyKJ9nylNyIt/m8Q8/ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAqPLB/xr0Yv1l9w/RO+bqFtl8TkxF/6jOqoEUXY06dEInopLYpmkksZZ9G8vebt6hAoLjaxNMdRqCkzKgy4jn7/SQZNV9FMbZ7ckiDxsBxYZ2ZaBootuWzzVD6hCSO3Tg6JgkIzldtFtNcDVBRgZnHg+Rl6hn+gFV5S2OTTTPHWK7GHwgHXLhK7N0RL4YVrRCi/HTUZnuYCjBwvdDte5iqytY05cAO4p72P6YtDaOdAfL/IIKd1ylCWITDqTp/JDBz1uxjQmsXLVD/KEEtlvYlGjIr+wUUqIUPhFvB6ajl2NO0D/r+t1BH454zbodU92QnOJpXpoNuOv7jjALHCqo70mCSwTNUSZuVP6/KLmQe8sSzYs7O/c25FzHKBYy+aZujoa/X7aI6XVmsUkj6ae9MSvQurk0jMNg/Jy5EtWOMy7WEuyadrAv6KSP3oIfmL9jWoPcyOMfvjRHxGqOfZuFZatAwswY6O0E3ATTrN03t/BVqNHIYIXc6UOiUTo2Nx56", "gossipEndpoint": [{ "ipAddressV4": "iHQMHg==", "port": 30126 }, { "ipAddressV4": "CoAAHg==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAOxH0o7YkAUoMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDf6+SJl+puqRNd5r2Tb802jQTqPm7k3NXIeU8NQ3Hy9p0G+9p4Hgnt3ftipar7lKPKnp4PFrOP7E7XSKpafxK2OVQ0jTMvc6Yjqt+9mzyNSI1I8cSHTmhJ7kMBt0+NwVM8QN+fbKcbQaoNiPwMcckVtGeMad4aZM6hRyxzI0H3wgMj4JiM9VRwx7JbEo3R7akRwLwGr9ZQm2EQwqiyReNkBnXrsyP4KPPVAoeMfGchoAuBbV+r6v1OeYddocYmZkrsvMXUKF/uEcgd8gTu+pv3jObwIEVqXo1yC6ZlCFqO7LIvT8jTAAljkszoo67ykXTbKS0PZeLDg6nvdPvBMQ50yjfswR88S6N8VU6pud7Y+VbMYUiGzlrFi4MB9dikAjEj4PEetQyZdn84ZXGxerXlU/vTO2Fp4i1ec5rmX1P0WYMlbNELE408j5nfCfzD/qdcF5HZAiUVTYU/SWpzWcn34++KGpuqZZQdsGwCLQWeMeA/OEemYChis4cO94aOzrECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAlj5YIsbYXk2JGP9kRCBLDgz27ymYi1KDbO8g18V4T0zj2Zl7858U7mF9UBSSW+Cjl1UtUdvqFWZhh8jRoO3Jov1QGTULHRfyyPElD4VpwFribiu4GYJaodYy6NE50WwSJf32gLG0jHQWt7q+cOrn6WaG2h8O1sIxbTlnu1kqKQUQtu4oX8u23b5m9QXVJfJVdecwD5Rmab2d3dq/NNv2iNELH0myqtcoqw26xwIvXwaS4Gqi+Y0cOfjWL5Gv5AHIwvBXGIh3KUU7pbyBzqjkigbzSeoZw0C8G2cRTl0+QTuet2SVYlFh5J9/FBLvIfMfIpguglaU6xTVoRpo7RF24qQKFt2IlBROpqcwl0FyfE+2c19FGt1V8E5dYqE4T2mHT6FSOI3DckA2afBm1OCeMNtkqCQT8x+JvdKrgUh44QDm4PIVZDzaxog/zOzRWPCgpCPq0HcNMzgCVFt+4q8eTL9Ju/rQcS9bDosjMA69NGLIOCdPW2i/gkS9x9rTXgyp", "gossipEndpoint": [{ "ipAddressV4": "IkQRmQ==", "port": 30127 }, { "ipAddressV4": "CoAAHw==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIXlngkVEv6iMwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAL4o3FK8th1cG+FSlw4iT9FlkwK+hOj4Ay6Z70mZlsNwszgxvddUEO4BEdA1iSWfxkYOLl4QwwPr3l394a07VfB5OK3dqJ6CjVdByyvzghtk3gOpkskWlJxp6vah7BbIJFWE8off7fhCdwAGSrwIRdGE8u8GbKJIdHk6/XyjB3j0BXTIgeaPTJxLeuz/2l/dQVRMXyZNxlc5UVQYnX9haMRk7M5bkb9uwfYPRikEJFp6G72x7M7Q9lBGJ3ArCQn/lPJfHSg01GxfDhWH8DOwLaFdv1bCs2zHTn7R7Wq9ymXvkUsZhlYO4mLR8HKDcM3sCrJa2rg8vgnIoZupHABKxkgtT2wxV7fM5f2oiz0mDYDTRJpgmK1lmNANj2tKnGqeDnsW7Q3zwufgZZhbks8+8uigyOyKNbp6D7Vv5KeYRibjr/xh+yWT0v02dtpBIdhqDa5CUVD9fCwigZj3PQc8N4e47ZL6s1pXpQ6Cf0lB0fSsvyhnGRa8HMx2q5eg5j/lCQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQCr9yUzOoi0xhoDE1mqR3FR/iVCq9PaBUURWL743LDMrlEvpzKX0upcwwwdgJFjVqVUywh6rKeHQt4O4UV6FIbpp0PSjSE7XZSK3UNqnhZJhQ3aNrOP+6wBhm2B0ZjrxyMS1EWeD9tcNkdYluO00RlieAEV4zwoAfeFPSB21iXW5dU8idhNuTLptDc7SJoErxN+44jvcrSe/ZhpQohG6WfyDPH0BE1tyzsiD29PAWKkrfhg5kzjTAP/qFp+ByazeltP9/F0NXI5AHbE0pKYr56XUlwDfDZOTU9b1YeS7kKyPvccvC2j9NjGGM7NjafdFLHUTYBZiNUTZXVstddYtTCVbTqI7I/x6hoeeNVDZv7XluwZLrYsDNsNrWU3c9VijPK1CE5Owy+gJoGgxEHfA/n9Jvc3lEesqKBpW92RazkpHW2eD9wh8Ayv3q6PNDGzWyiXA8YWW6yD/dIp2Oh8szZUfOXy8sQ8VW86T6RsqGP5CKKPGW1NnP/KTKe5/WoBLZQ=", "gossipEndpoint": [{ "ipAddressV4": "Ike+6w==", "port": 30128 }, { "ipAddressV4": "CoAAIA==", "port": 30128 }] }] } | |||||||||
| node2 | 5.961s | 2025-11-17 21:51:40.205 | 36 | INFO | STARTUP | <main> | TransactionHandlingHistory: | Consistency testing tool log path: data/saved/consistency-test/2/ConsistencyTestLog.csv | |
| node2 | 5.962s | 2025-11-17 21:51:40.206 | 37 | INFO | STARTUP | <main> | TransactionHandlingHistory: | No log file found. Starting without any previous history | |
| node2 | 5.975s | 2025-11-17 21:51:40.219 | 38 | INFO | STARTUP | <main> | StateInitializer: | The platform is using the following initial state: Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: 61637a9249941b41469383786ea3e9cf47d7e3c2e8c4a83b6668ae7def5f6c79683a4d924180b5192cfc05b2082b9003 (root) VirtualMap state / cluster-false-crater-habit | |
| node2 | 5.978s | 2025-11-17 21:51:40.222 | 40 | INFO | RECONNECT | <<platform-core: reconnectController>> | ReconnectController: | Starting the ReconnectController | |
| node2 | 6.226s | 2025-11-17 21:51:40.470 | 41 | INFO | EVENT_STREAM | <main> | DefaultConsensusEventStream: | EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b | |
| node2 | 6.231s | 2025-11-17 21:51:40.475 | 42 | INFO | STARTUP | <platformForkJoinThread-2> | Shadowgraph: | Shadowgraph starting from expiration threshold 1 | |
| node2 | 6.237s | 2025-11-17 21:51:40.481 | 43 | INFO | STARTUP | <<start-node-2>> | ConsistencyTestingToolMain: | init called in Main for node 2. | |
| node2 | 6.237s | 2025-11-17 21:51:40.481 | 44 | INFO | STARTUP | <<start-node-2>> | SwirldsPlatform: | Starting platform 2 | |
| node2 | 6.239s | 2025-11-17 21:51:40.483 | 45 | INFO | STARTUP | <<platform: recycle-bin-cleanup>> | RecycleBinImpl: | Deleted 0 files from the recycle bin. | |
| node2 | 6.242s | 2025-11-17 21:51:40.486 | 46 | INFO | STARTUP | <<start-node-2>> | CycleFinder: | No cyclical back pressure detected in wiring model. | |
| node2 | 6.243s | 2025-11-17 21:51:40.487 | 47 | INFO | STARTUP | <<start-node-2>> | DirectSchedulerChecks: | No illegal direct scheduler use detected in the wiring model. | |
| node2 | 6.244s | 2025-11-17 21:51:40.488 | 48 | INFO | STARTUP | <<start-node-2>> | InputWireChecks: | All input wires have been bound. | |
| node2 | 6.246s | 2025-11-17 21:51:40.490 | 49 | WARN | STARTUP | <<start-node-2>> | PcesFileTracker: | No preconsensus event files available | |
| node2 | 6.246s | 2025-11-17 21:51:40.490 | 50 | INFO | STARTUP | <<start-node-2>> | SwirldsPlatform: | replaying preconsensus event stream starting at 0 | |
| node2 | 6.248s | 2025-11-17 21:51:40.492 | 51 | INFO | STARTUP | <<start-node-2>> | PcesReplayer: | Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds. | |
| node2 | 6.249s | 2025-11-17 21:51:40.493 | 52 | INFO | STARTUP | <<app: appMain 2>> | ConsistencyTestingToolMain: | run called in Main. | |
| node2 | 6.252s | 2025-11-17 21:51:40.496 | 53 | INFO | PLATFORM_STATUS | <platformForkJoinThread-6> | StatusStateMachine: | Platform spent 205.0 ms in STARTING_UP. Now in REPLAYING_EVENTS | |
| node2 | 6.258s | 2025-11-17 21:51:40.502 | 54 | INFO | PLATFORM_STATUS | <platformForkJoinThread-6> | StatusStateMachine: | Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING | |
| node0 | 6.462s | 2025-11-17 21:51:40.706 | 31 | INFO | STARTUP | <main> | OSHealthChecker: | PASSED - Clock Source Speed Check Report[callsPerSec=26161171] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=300720, randomLong=-8702631581737953231, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=15220, randomLong=-4563111112219035683, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=1689350, data=35, exception=null] OS Health Check Report - Complete (took 1027 ms) | |
| node0 | 6.496s | 2025-11-17 21:51:40.740 | 32 | DEBUG | STARTUP | <main> | BootstrapUtils: | jvmPauseDetectorThread started | |
| node0 | 6.505s | 2025-11-17 21:51:40.749 | 33 | INFO | STARTUP | <main> | StandardScratchpad: | Scratchpad platform.iss contents: LAST_ISS_ROUND null | |
| node0 | 6.507s | 2025-11-17 21:51:40.751 | 34 | INFO | STARTUP | <main> | PlatformBuilder: | Default platform pool parallelism: 8 | |
| node0 | 6.595s | 2025-11-17 21:51:40.839 | 35 | INFO | STARTUP | <main> | SwirldsPlatform: | Starting with roster history: | |
| RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIdUmpLKzyXgUwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBALXCoDQ+HOVsEDTZpFuJITSaGwaKX2is5K1P/lV+G+ll6u36IdqKNnZIirJrpX2N0Ad6NeF/oFcMhietrKt818PDA9Tbb2tqcHNKTxxZAEj7amQTsrU4EsNmUhaPgMs89yj9WLxCXVzW05cQjqYEA/hymzohWs1BdU3Y2KdmELe0v5fzRgDpNgYHhUN7IrlrlgXEWpuKRskBYc4PIvyACijY0/zkeEAyHOshYYGKhQbNm/NGWhFq83ro77CZZhX3Vl7hRnHLaEoCEE8atY8R1Txhy8aObhiS6R8ZVRTkZLar/FG/xe78RQfwHHD1al2w5oHR7xgTZylhbD+nVQ09Zmi25USpvqwumbMBE0OWhV+VH1WLCHfLQs6/5yuDjeZ/0D9tpQ8pfkiEkGLedzUzQkq+4/HmN4IFTOhgJHlu1tVUqohZIPZ5zSzqkqFzFQGRo2uAX8C2EJ3qgQMAEOpH8iOjiSKsezlIPuwvmrVDPxVfpY2Cq60oxRu6B8bZdbQkfwIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQAloxwiVu7pBhkO4fLqYRw4FC0VEx+c47W4xnrq3G/uXMGwE2Mfwple9FZnfT9JgSoT1UVw+cigo4720WdrPqkK8qnA3/PzGXlfJ3k6eFcBuli/KY1TakIJUAxFt5biNKatheMwAKsbF/JyVyaqG2dbSaXQ6hZBLQTYmLrmFWMvi9QdM1S8vNVMjn0hE2qQJtnVRuVwqRaAQ225jDv2CUCT28t0EWE6ccbiRi74l8KoW1Lo3v2EQ6ZZ89Xt3CwFSQHa6YVT685ECy82qMysU+YHBe9WmwJW05UAAY7JRsOo+RuuU/r4acNLmzprG+l7qsqqPkwXTcziw9Y2OYsFgY4bTlIOV0JC0AYApctDB3gbn83LM73CWccGrXq0liSV0wL11wscH3gFohXrwb646+6hgncZiDshlZlWaFSkHQJAxTR9bsbsCwKdZpzIIVOVTOT/3oLQKCCQvPriTpJiNa0P6gB0pq64lNcyG9fL8vS3YFFnWJTZwb8ZzGK+LZ91/2Y=", "gossipEndpoint": [{ "ipAddressV4": "InvOnA==", "port": 30124 }, { "ipAddressV4": "CoAAHQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAJguXwyGFpb8MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTIwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTIwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDYXoYHBtw8adD5sxLZSnlG9XgBLWVbIDl3YA4rZZ11cgl6FG2TvF8UVNXQ177cRm1xUUJRI5ulSgDofnm7Iuf6c/GoQrud2nP1yMWewGslwiEi1h2pxbN7doFvn/92Y0lJVwSV/vOpbIyPRoMeF0jXd7TEI7dYj4S7gV9uWmQCIWjwTZqVsjIAtzEkYnmS0/m5XuD9MJsin8OQRu/PEFL8qaVPQJ2GhOhpUJqvADQ/Lsq/FHcPjylcRcnUQlFRojk2jqugtoRegByjPrAOSYGJeWUCVYmd7W51L/AkVx1rDLeHj0zLTTzQRF5G56i+S+tAcpY/uiCrwLvszFlDlD1diOuaucmu54lalrSTlVe5eOyq2ga2tKi11LQ+w09105zLyRWk7DBU93f5dTYNSmokI7b4sVRxu6SP0p/F9wND77wv2Ax5OpIWWty8zy8Y+xOuRyFu/rJ4ddDmRYvRmptM0rCAfv6hgd3m5Y/OAadQm/OuN91Uq9PIJdlMtjDbIfECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEANutmL3V1PlvlsZ6xG8Sx9cKTok3kf3rBf7D7eE8Nn8ryHi3cw9CvCaj1E6zmTTh9k23DAZVWulhjTY5GWcx5NO7QAWjKau44g/HecNNrWsD/+nIrhmAk2WxKp175CwqJaIWA7CM6VMfFktjaflUPcB6RJnHrAa8M1HUpEsBz0mFmLz7lIaDemxYCE8M8slb6wTMjpL83GB+ejudRe7YK2ZWixM+CGp0ARkV+EecHaCXgEoROUNwP6mZVJcgSVR1QBQwcGAMIrutsKENM8HR9o3LWacigoJXf+IX8c6aJhrHfFvm62q+hi3baj7iR6gebEdWPtmEXgoVWOk230fLGyPU1oBxaDdYa8V4+ZFv03O91By9tuFrwZOcLCb4CPRyr8A47lHNjRIeo2nUF/c+SjV0eBcPKCnn1nW/AQWCxJ0QzzG6tEeMAGdDrE2ujPlB+Y9Sn8vB0zjYQHTr1NKyyXNogB4y48jofLDLDGOQYI6uP2fDgZeiq4dV8w91WbPHV", "gossipEndpoint": [{ "ipAddressV4": "Ih3SfA==", "port": 30125 }, { "ipAddressV4": "CoAAIQ==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAJwswl59m488MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDAqlNMpfduuW0ETQVjdKf5ZBe3Ug/ybRMoCWIlue8UoxFzamAtoeFEW3GVi862iImRVyHbkBZzDQUw4ABwMdxfzTL9voozkMaOZb4KQ9yZ9zNLAAmSSuE6RFmSJnBtfufxFXqiu6esbcvyropjZLc65F2uoMCpKN0CHFpWEb2GZAaipp7WCOon0NllDLqkjPylluXO4mjbzzMSDPbBWRD8VjjkxZeszWSXYxz9hqcRYX01CGg+jhooCQ6j2yB8sfFAffIeTG6GSV1uCFa4san2emhQWpr+cHaVYJMtejL43HaEVQnF3vh5Z10T/7co63C63aay2hs6Bx5SschosyYiafI7GtbQ4qpOgjEDFT1jlydK21gy6MV3SFEYwcUfxvxxRj6pS7xiMFn4FYnBKPJWkaDkwTqboEshxstvASQOW993uEwzh4EjctRHSjSuTU6S9OsWi5I5cRF+xK6GaWsTp0KyO8uVpuM9kZfpOcor294quyKJ9nylNyIt/m8Q8/ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAqPLB/xr0Yv1l9w/RO+bqFtl8TkxF/6jOqoEUXY06dEInopLYpmkksZZ9G8vebt6hAoLjaxNMdRqCkzKgy4jn7/SQZNV9FMbZ7ckiDxsBxYZ2ZaBootuWzzVD6hCSO3Tg6JgkIzldtFtNcDVBRgZnHg+Rl6hn+gFV5S2OTTTPHWK7GHwgHXLhK7N0RL4YVrRCi/HTUZnuYCjBwvdDte5iqytY05cAO4p72P6YtDaOdAfL/IIKd1ylCWITDqTp/JDBz1uxjQmsXLVD/KEEtlvYlGjIr+wUUqIUPhFvB6ajl2NO0D/r+t1BH454zbodU92QnOJpXpoNuOv7jjALHCqo70mCSwTNUSZuVP6/KLmQe8sSzYs7O/c25FzHKBYy+aZujoa/X7aI6XVmsUkj6ae9MSvQurk0jMNg/Jy5EtWOMy7WEuyadrAv6KSP3oIfmL9jWoPcyOMfvjRHxGqOfZuFZatAwswY6O0E3ATTrN03t/BVqNHIYIXc6UOiUTo2Nx56", "gossipEndpoint": [{ "ipAddressV4": "iHQMHg==", "port": 30126 }, { "ipAddressV4": "CoAAHg==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAOxH0o7YkAUoMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDf6+SJl+puqRNd5r2Tb802jQTqPm7k3NXIeU8NQ3Hy9p0G+9p4Hgnt3ftipar7lKPKnp4PFrOP7E7XSKpafxK2OVQ0jTMvc6Yjqt+9mzyNSI1I8cSHTmhJ7kMBt0+NwVM8QN+fbKcbQaoNiPwMcckVtGeMad4aZM6hRyxzI0H3wgMj4JiM9VRwx7JbEo3R7akRwLwGr9ZQm2EQwqiyReNkBnXrsyP4KPPVAoeMfGchoAuBbV+r6v1OeYddocYmZkrsvMXUKF/uEcgd8gTu+pv3jObwIEVqXo1yC6ZlCFqO7LIvT8jTAAljkszoo67ykXTbKS0PZeLDg6nvdPvBMQ50yjfswR88S6N8VU6pud7Y+VbMYUiGzlrFi4MB9dikAjEj4PEetQyZdn84ZXGxerXlU/vTO2Fp4i1ec5rmX1P0WYMlbNELE408j5nfCfzD/qdcF5HZAiUVTYU/SWpzWcn34++KGpuqZZQdsGwCLQWeMeA/OEemYChis4cO94aOzrECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAlj5YIsbYXk2JGP9kRCBLDgz27ymYi1KDbO8g18V4T0zj2Zl7858U7mF9UBSSW+Cjl1UtUdvqFWZhh8jRoO3Jov1QGTULHRfyyPElD4VpwFribiu4GYJaodYy6NE50WwSJf32gLG0jHQWt7q+cOrn6WaG2h8O1sIxbTlnu1kqKQUQtu4oX8u23b5m9QXVJfJVdecwD5Rmab2d3dq/NNv2iNELH0myqtcoqw26xwIvXwaS4Gqi+Y0cOfjWL5Gv5AHIwvBXGIh3KUU7pbyBzqjkigbzSeoZw0C8G2cRTl0+QTuet2SVYlFh5J9/FBLvIfMfIpguglaU6xTVoRpo7RF24qQKFt2IlBROpqcwl0FyfE+2c19FGt1V8E5dYqE4T2mHT6FSOI3DckA2afBm1OCeMNtkqCQT8x+JvdKrgUh44QDm4PIVZDzaxog/zOzRWPCgpCPq0HcNMzgCVFt+4q8eTL9Ju/rQcS9bDosjMA69NGLIOCdPW2i/gkS9x9rTXgyp", "gossipEndpoint": [{ "ipAddressV4": "IkQRmQ==", "port": 30127 }, { "ipAddressV4": "CoAAHw==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIXlngkVEv6iMwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAL4o3FK8th1cG+FSlw4iT9FlkwK+hOj4Ay6Z70mZlsNwszgxvddUEO4BEdA1iSWfxkYOLl4QwwPr3l394a07VfB5OK3dqJ6CjVdByyvzghtk3gOpkskWlJxp6vah7BbIJFWE8off7fhCdwAGSrwIRdGE8u8GbKJIdHk6/XyjB3j0BXTIgeaPTJxLeuz/2l/dQVRMXyZNxlc5UVQYnX9haMRk7M5bkb9uwfYPRikEJFp6G72x7M7Q9lBGJ3ArCQn/lPJfHSg01GxfDhWH8DOwLaFdv1bCs2zHTn7R7Wq9ymXvkUsZhlYO4mLR8HKDcM3sCrJa2rg8vgnIoZupHABKxkgtT2wxV7fM5f2oiz0mDYDTRJpgmK1lmNANj2tKnGqeDnsW7Q3zwufgZZhbks8+8uigyOyKNbp6D7Vv5KeYRibjr/xh+yWT0v02dtpBIdhqDa5CUVD9fCwigZj3PQc8N4e47ZL6s1pXpQ6Cf0lB0fSsvyhnGRa8HMx2q5eg5j/lCQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQCr9yUzOoi0xhoDE1mqR3FR/iVCq9PaBUURWL743LDMrlEvpzKX0upcwwwdgJFjVqVUywh6rKeHQt4O4UV6FIbpp0PSjSE7XZSK3UNqnhZJhQ3aNrOP+6wBhm2B0ZjrxyMS1EWeD9tcNkdYluO00RlieAEV4zwoAfeFPSB21iXW5dU8idhNuTLptDc7SJoErxN+44jvcrSe/ZhpQohG6WfyDPH0BE1tyzsiD29PAWKkrfhg5kzjTAP/qFp+ByazeltP9/F0NXI5AHbE0pKYr56XUlwDfDZOTU9b1YeS7kKyPvccvC2j9NjGGM7NjafdFLHUTYBZiNUTZXVstddYtTCVbTqI7I/x6hoeeNVDZv7XluwZLrYsDNsNrWU3c9VijPK1CE5Owy+gJoGgxEHfA/n9Jvc3lEesqKBpW92RazkpHW2eD9wh8Ayv3q6PNDGzWyiXA8YWW6yD/dIp2Oh8szZUfOXy8sQ8VW86T6RsqGP5CKKPGW1NnP/KTKe5/WoBLZQ=", "gossipEndpoint": [{ "ipAddressV4": "Ike+6w==", "port": 30128 }, { "ipAddressV4": "CoAAIA==", "port": 30128 }] }] } | |||||||||
| node0 | 6.620s | 2025-11-17 21:51:40.864 | 36 | INFO | STARTUP | <main> | TransactionHandlingHistory: | Consistency testing tool log path: data/saved/consistency-test/0/ConsistencyTestLog.csv | |
| node0 | 6.621s | 2025-11-17 21:51:40.865 | 37 | INFO | STARTUP | <main> | TransactionHandlingHistory: | No log file found. Starting without any previous history | |
| node0 | 6.635s | 2025-11-17 21:51:40.879 | 38 | INFO | STARTUP | <main> | StateInitializer: | The platform is using the following initial state: | |
| Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: 61637a9249941b41469383786ea3e9cf47d7e3c2e8c4a83b6668ae7def5f6c79683a4d924180b5192cfc05b2082b9003 (root) VirtualMap state / cluster-false-crater-habit | |||||||||
| node0 | 6.639s | 2025-11-17 21:51:40.883 | 40 | INFO | RECONNECT | <<platform-core: reconnectController>> | ReconnectController: | Starting the ReconnectController | |
| node1 | 6.729s | 2025-11-17 21:51:40.973 | 31 | INFO | STARTUP | <main> | OSHealthChecker: | ||
| PASSED - Clock Source Speed Check Report[callsPerSec=26080227] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=304880, randomLong=4292755600708031116, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=15450, randomLong=-3776546428223121232, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=1851779, data=35, exception=null] OS Health Check Report - Complete (took 1033 ms) | |||||||||
| node1 | 6.768s | 2025-11-17 21:51:41.012 | 32 | DEBUG | STARTUP | <main> | BootstrapUtils: | jvmPauseDetectorThread started | |
| node1 | 6.781s | 2025-11-17 21:51:41.025 | 33 | INFO | STARTUP | <main> | StandardScratchpad: | Scratchpad platform.iss contents: | |
| LAST_ISS_ROUND null | |||||||||
| node1 | 6.784s | 2025-11-17 21:51:41.028 | 34 | INFO | STARTUP | <main> | PlatformBuilder: | Default platform pool parallelism: 8 | |
| node1 | 6.882s | 2025-11-17 21:51:41.126 | 35 | INFO | STARTUP | <main> | SwirldsPlatform: | Starting with roster history: | |
| RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIdUmpLKzyXgUwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBALXCoDQ+HOVsEDTZpFuJITSaGwaKX2is5K1P/lV+G+ll6u36IdqKNnZIirJrpX2N0Ad6NeF/oFcMhietrKt818PDA9Tbb2tqcHNKTxxZAEj7amQTsrU4EsNmUhaPgMs89yj9WLxCXVzW05cQjqYEA/hymzohWs1BdU3Y2KdmELe0v5fzRgDpNgYHhUN7IrlrlgXEWpuKRskBYc4PIvyACijY0/zkeEAyHOshYYGKhQbNm/NGWhFq83ro77CZZhX3Vl7hRnHLaEoCEE8atY8R1Txhy8aObhiS6R8ZVRTkZLar/FG/xe78RQfwHHD1al2w5oHR7xgTZylhbD+nVQ09Zmi25USpvqwumbMBE0OWhV+VH1WLCHfLQs6/5yuDjeZ/0D9tpQ8pfkiEkGLedzUzQkq+4/HmN4IFTOhgJHlu1tVUqohZIPZ5zSzqkqFzFQGRo2uAX8C2EJ3qgQMAEOpH8iOjiSKsezlIPuwvmrVDPxVfpY2Cq60oxRu6B8bZdbQkfwIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQAloxwiVu7pBhkO4fLqYRw4FC0VEx+c47W4xnrq3G/uXMGwE2Mfwple9FZnfT9JgSoT1UVw+cigo4720WdrPqkK8qnA3/PzGXlfJ3k6eFcBuli/KY1TakIJUAxFt5biNKatheMwAKsbF/JyVyaqG2dbSaXQ6hZBLQTYmLrmFWMvi9QdM1S8vNVMjn0hE2qQJtnVRuVwqRaAQ225jDv2CUCT28t0EWE6ccbiRi74l8KoW1Lo3v2EQ6ZZ89Xt3CwFSQHa6YVT685ECy82qMysU+YHBe9WmwJW05UAAY7JRsOo+RuuU/r4acNLmzprG+l7qsqqPkwXTcziw9Y2OYsFgY4bTlIOV0JC0AYApctDB3gbn83LM73CWccGrXq0liSV0wL11wscH3gFohXrwb646+6hgncZiDshlZlWaFSkHQJAxTR9bsbsCwKdZpzIIVOVTOT/3oLQKCCQvPriTpJiNa0P6gB0pq64lNcyG9fL8vS3YFFnWJTZwb8ZzGK+LZ91/2Y=", "gossipEndpoint": [{ "ipAddressV4": "InvOnA==", "port": 30124 }, { "ipAddressV4": "CoAAHQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAJguXwyGFpb8MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTIwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTIwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDYXoYHBtw8adD5sxLZSnlG9XgBLWVbIDl3YA4rZZ11cgl6FG2TvF8UVNXQ177cRm1xUUJRI5ulSgDofnm7Iuf6c/GoQrud2nP1yMWewGslwiEi1h2pxbN7doFvn/92Y0lJVwSV/vOpbIyPRoMeF0jXd7TEI7dYj4S7gV9uWmQCIWjwTZqVsjIAtzEkYnmS0/m5XuD9MJsin8OQRu/PEFL8qaVPQJ2GhOhpUJqvADQ/Lsq/FHcPjylcRcnUQlFRojk2jqugtoRegByjPrAOSYGJeWUCVYmd7W51L/AkVx1rDLeHj0zLTTzQRF5G56i+S+tAcpY/uiCrwLvszFlDlD1diOuaucmu54lalrSTlVe5eOyq2ga2tKi11LQ+w09105zLyRWk7DBU93f5dTYNSmokI7b4sVRxu6SP0p/F9wND77wv2Ax5OpIWWty8zy8Y+xOuRyFu/rJ4ddDmRYvRmptM0rCAfv6hgd3m5Y/OAadQm/OuN91Uq9PIJdlMtjDbIfECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEANutmL3V1PlvlsZ6xG8Sx9cKTok3kf3rBf7D7eE8Nn8ryHi3cw9CvCaj1E6zmTTh9k23DAZVWulhjTY5GWcx5NO7QAWjKau44g/HecNNrWsD/+nIrhmAk2WxKp175CwqJaIWA7CM6VMfFktjaflUPcB6RJnHrAa8M1HUpEsBz0mFmLz7lIaDemxYCE8M8slb6wTMjpL83GB+ejudRe7YK2ZWixM+CGp0ARkV+EecHaCXgEoROUNwP6mZVJcgSVR1QBQwcGAMIrutsKENM8HR9o3LWacigoJXf+IX8c6aJhrHfFvm62q+hi3baj7iR6gebEdWPtmEXgoVWOk230fLGyPU1oBxaDdYa8V4+ZFv03O91By9tuFrwZOcLCb4CPRyr8A47lHNjRIeo2nUF/c+SjV0eBcPKCnn1nW/AQWCxJ0QzzG6tEeMAGdDrE2ujPlB+Y9Sn8vB0zjYQHTr1NKyyXNogB4y48jofLDLDGOQYI6uP2fDgZeiq4dV8w91WbPHV", "gossipEndpoint": [{ "ipAddressV4": "Ih3SfA==", "port": 30125 }, { "ipAddressV4": "CoAAIQ==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAJwswl59m488MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDAqlNMpfduuW0ETQVjdKf5ZBe3Ug/ybRMoCWIlue8UoxFzamAtoeFEW3GVi862iImRVyHbkBZzDQUw4ABwMdxfzTL9voozkMaOZb4KQ9yZ9zNLAAmSSuE6RFmSJnBtfufxFXqiu6esbcvyropjZLc65F2uoMCpKN0CHFpWEb2GZAaipp7WCOon0NllDLqkjPylluXO4mjbzzMSDPbBWRD8VjjkxZeszWSXYxz9hqcRYX01CGg+jhooCQ6j2yB8sfFAffIeTG6GSV1uCFa4san2emhQWpr+cHaVYJMtejL43HaEVQnF3vh5Z10T/7co63C63aay2hs6Bx5SschosyYiafI7GtbQ4qpOgjEDFT1jlydK21gy6MV3SFEYwcUfxvxxRj6pS7xiMFn4FYnBKPJWkaDkwTqboEshxstvASQOW993uEwzh4EjctRHSjSuTU6S9OsWi5I5cRF+xK6GaWsTp0KyO8uVpuM9kZfpOcor294quyKJ9nylNyIt/m8Q8/ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAqPLB/xr0Yv1l9w/RO+bqFtl8TkxF/6jOqoEUXY06dEInopLYpmkksZZ9G8vebt6hAoLjaxNMdRqCkzKgy4jn7/SQZNV9FMbZ7ckiDxsBxYZ2ZaBootuWzzVD6hCSO3Tg6JgkIzldtFtNcDVBRgZnHg+Rl6hn+gFV5S2OTTTPHWK7GHwgHXLhK7N0RL4YVrRCi/HTUZnuYCjBwvdDte5iqytY05cAO4p72P6YtDaOdAfL/IIKd1ylCWITDqTp/JDBz1uxjQmsXLVD/KEEtlvYlGjIr+wUUqIUPhFvB6ajl2NO0D/r+t1BH454zbodU92QnOJpXpoNuOv7jjALHCqo70mCSwTNUSZuVP6/KLmQe8sSzYs7O/c25FzHKBYy+aZujoa/X7aI6XVmsUkj6ae9MSvQurk0jMNg/Jy5EtWOMy7WEuyadrAv6KSP3oIfmL9jWoPcyOMfvjRHxGqOfZuFZatAwswY6O0E3ATTrN03t/BVqNHIYIXc6UOiUTo2Nx56", "gossipEndpoint": [{ "ipAddressV4": "iHQMHg==", "port": 30126 }, { "ipAddressV4": "CoAAHg==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAOxH0o7YkAUoMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDf6+SJl+puqRNd5r2Tb802jQTqPm7k3NXIeU8NQ3Hy9p0G+9p4Hgnt3ftipar7lKPKnp4PFrOP7E7XSKpafxK2OVQ0jTMvc6Yjqt+9mzyNSI1I8cSHTmhJ7kMBt0+NwVM8QN+fbKcbQaoNiPwMcckVtGeMad4aZM6hRyxzI0H3wgMj4JiM9VRwx7JbEo3R7akRwLwGr9ZQm2EQwqiyReNkBnXrsyP4KPPVAoeMfGchoAuBbV+r6v1OeYddocYmZkrsvMXUKF/uEcgd8gTu+pv3jObwIEVqXo1yC6ZlCFqO7LIvT8jTAAljkszoo67ykXTbKS0PZeLDg6nvdPvBMQ50yjfswR88S6N8VU6pud7Y+VbMYUiGzlrFi4MB9dikAjEj4PEetQyZdn84ZXGxerXlU/vTO2Fp4i1ec5rmX1P0WYMlbNELE408j5nfCfzD/qdcF5HZAiUVTYU/SWpzWcn34++KGpuqZZQdsGwCLQWeMeA/OEemYChis4cO94aOzrECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAlj5YIsbYXk2JGP9kRCBLDgz27ymYi1KDbO8g18V4T0zj2Zl7858U7mF9UBSSW+Cjl1UtUdvqFWZhh8jRoO3Jov1QGTULHRfyyPElD4VpwFribiu4GYJaodYy6NE50WwSJf32gLG0jHQWt7q+cOrn6WaG2h8O1sIxbTlnu1kqKQUQtu4oX8u23b5m9QXVJfJVdecwD5Rmab2d3dq/NNv2iNELH0myqtcoqw26xwIvXwaS4Gqi+Y0cOfjWL5Gv5AHIwvBXGIh3KUU7pbyBzqjkigbzSeoZw0C8G2cRTl0+QTuet2SVYlFh5J9/FBLvIfMfIpguglaU6xTVoRpo7RF24qQKFt2IlBROpqcwl0FyfE+2c19FGt1V8E5dYqE4T2mHT6FSOI3DckA2afBm1OCeMNtkqCQT8x+JvdKrgUh44QDm4PIVZDzaxog/zOzRWPCgpCPq0HcNMzgCVFt+4q8eTL9Ju/rQcS9bDosjMA69NGLIOCdPW2i/gkS9x9rTXgyp", "gossipEndpoint": [{ "ipAddressV4": "IkQRmQ==", "port": 30127 }, { "ipAddressV4": "CoAAHw==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIXlngkVEv6iMwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAL4o3FK8th1cG+FSlw4iT9FlkwK+hOj4Ay6Z70mZlsNwszgxvddUEO4BEdA1iSWfxkYOLl4QwwPr3l394a07VfB5OK3dqJ6CjVdByyvzghtk3gOpkskWlJxp6vah7BbIJFWE8off7fhCdwAGSrwIRdGE8u8GbKJIdHk6/XyjB3j0BXTIgeaPTJxLeuz/2l/dQVRMXyZNxlc5UVQYnX9haMRk7M5bkb9uwfYPRikEJFp6G72x7M7Q9lBGJ3ArCQn/lPJfHSg01GxfDhWH8DOwLaFdv1bCs2zHTn7R7Wq9ymXvkUsZhlYO4mLR8HKDcM3sCrJa2rg8vgnIoZupHABKxkgtT2wxV7fM5f2oiz0mDYDTRJpgmK1lmNANj2tKnGqeDnsW7Q3zwufgZZhbks8+8uigyOyKNbp6D7Vv5KeYRibjr/xh+yWT0v02dtpBIdhqDa5CUVD9fCwigZj3PQc8N4e47ZL6s1pXpQ6Cf0lB0fSsvyhnGRa8HMx2q5eg5j/lCQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQCr9yUzOoi0xhoDE1mqR3FR/iVCq9PaBUURWL743LDMrlEvpzKX0upcwwwdgJFjVqVUywh6rKeHQt4O4UV6FIbpp0PSjSE7XZSK3UNqnhZJhQ3aNrOP+6wBhm2B0ZjrxyMS1EWeD9tcNkdYluO00RlieAEV4zwoAfeFPSB21iXW5dU8idhNuTLptDc7SJoErxN+44jvcrSe/ZhpQohG6WfyDPH0BE1tyzsiD29PAWKkrfhg5kzjTAP/qFp+ByazeltP9/F0NXI5AHbE0pKYr56XUlwDfDZOTU9b1YeS7kKyPvccvC2j9NjGGM7NjafdFLHUTYBZiNUTZXVstddYtTCVbTqI7I/x6hoeeNVDZv7XluwZLrYsDNsNrWU3c9VijPK1CE5Owy+gJoGgxEHfA/n9Jvc3lEesqKBpW92RazkpHW2eD9wh8Ayv3q6PNDGzWyiXA8YWW6yD/dIp2Oh8szZUfOXy8sQ8VW86T6RsqGP5CKKPGW1NnP/KTKe5/WoBLZQ=", "gossipEndpoint": [{ "ipAddressV4": "Ike+6w==", "port": 30128 }, { "ipAddressV4": "CoAAIA==", "port": 30128 }] }] } | |||||||||
| node0 | 6.890s | 2025-11-17 21:51:41.134 | 41 | INFO | EVENT_STREAM | <main> | DefaultConsensusEventStream: | EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b | |
| node0 | 6.895s | 2025-11-17 21:51:41.139 | 42 | INFO | STARTUP | <platformForkJoinThread-2> | Shadowgraph: | Shadowgraph starting from expiration threshold 1 | |
| node0 | 6.900s | 2025-11-17 21:51:41.144 | 43 | INFO | STARTUP | <<start-node-0>> | ConsistencyTestingToolMain: | init called in Main for node 0. | |
| node0 | 6.901s | 2025-11-17 21:51:41.145 | 44 | INFO | STARTUP | <<start-node-0>> | SwirldsPlatform: | Starting platform 0 | |
| node0 | 6.903s | 2025-11-17 21:51:41.147 | 45 | INFO | STARTUP | <<platform: recycle-bin-cleanup>> | RecycleBinImpl: | Deleted 0 files from the recycle bin. | |
| node0 | 6.907s | 2025-11-17 21:51:41.151 | 46 | INFO | STARTUP | <<start-node-0>> | CycleFinder: | No cyclical back pressure detected in wiring model. | |
| node0 | 6.908s | 2025-11-17 21:51:41.152 | 47 | INFO | STARTUP | <<start-node-0>> | DirectSchedulerChecks: | No illegal direct scheduler use detected in the wiring model. | |
| node0 | 6.909s | 2025-11-17 21:51:41.153 | 48 | INFO | STARTUP | <<start-node-0>> | InputWireChecks: | All input wires have been bound. | |
| node0 | 6.911s | 2025-11-17 21:51:41.155 | 49 | WARN | STARTUP | <<start-node-0>> | PcesFileTracker: | No preconsensus event files available | |
| node0 | 6.911s | 2025-11-17 21:51:41.155 | 50 | INFO | STARTUP | <<start-node-0>> | SwirldsPlatform: | replaying preconsensus event stream starting at 0 | |
| node1 | 6.911s | 2025-11-17 21:51:41.155 | 36 | INFO | STARTUP | <main> | TransactionHandlingHistory: | Consistency testing tool log path: data/saved/consistency-test/1/ConsistencyTestLog.csv | |
| node1 | 6.912s | 2025-11-17 21:51:41.156 | 37 | INFO | STARTUP | <main> | TransactionHandlingHistory: | No log file found. Starting without any previous history | |
| node0 | 6.913s | 2025-11-17 21:51:41.157 | 51 | INFO | STARTUP | <<start-node-0>> | PcesReplayer: | Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds. | |
| node0 | 6.915s | 2025-11-17 21:51:41.159 | 52 | INFO | STARTUP | <<app: appMain 0>> | ConsistencyTestingToolMain: | run called in Main. | |
| node0 | 6.917s | 2025-11-17 21:51:41.161 | 53 | INFO | PLATFORM_STATUS | <platformForkJoinThread-4> | StatusStateMachine: | Platform spent 217.0 ms in STARTING_UP. Now in REPLAYING_EVENTS | |
| node0 | 6.922s | 2025-11-17 21:51:41.166 | 54 | INFO | PLATFORM_STATUS | <platformForkJoinThread-4> | StatusStateMachine: | Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING | |
| node1 | 6.927s | 2025-11-17 21:51:41.171 | 38 | INFO | STARTUP | <main> | StateInitializer: | The platform is using the following initial state: | |
| Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: 61637a9249941b41469383786ea3e9cf47d7e3c2e8c4a83b6668ae7def5f6c79683a4d924180b5192cfc05b2082b9003 (root) VirtualMap state / cluster-false-crater-habit | |||||||||
| node1 | 6.932s | 2025-11-17 21:51:41.176 | 40 | INFO | RECONNECT | <<platform-core: reconnectController>> | ReconnectController: | Starting the ReconnectController | |
| node1 | 7.201s | 2025-11-17 21:51:41.445 | 41 | INFO | EVENT_STREAM | <main> | DefaultConsensusEventStream: | EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b | |
| node1 | 7.207s | 2025-11-17 21:51:41.451 | 42 | INFO | STARTUP | <platformForkJoinThread-2> | Shadowgraph: | Shadowgraph starting from expiration threshold 1 | |
| node1 | 7.213s | 2025-11-17 21:51:41.457 | 43 | INFO | STARTUP | <<start-node-1>> | ConsistencyTestingToolMain: | init called in Main for node 1. | |
| node1 | 7.214s | 2025-11-17 21:51:41.458 | 44 | INFO | STARTUP | <<start-node-1>> | SwirldsPlatform: | Starting platform 1 | |
| node1 | 7.215s | 2025-11-17 21:51:41.459 | 45 | INFO | STARTUP | <<platform: recycle-bin-cleanup>> | RecycleBinImpl: | Deleted 0 files from the recycle bin. | |
| node1 | 7.219s | 2025-11-17 21:51:41.463 | 46 | INFO | STARTUP | <<start-node-1>> | CycleFinder: | No cyclical back pressure detected in wiring model. | |
| node1 | 7.220s | 2025-11-17 21:51:41.464 | 47 | INFO | STARTUP | <<start-node-1>> | DirectSchedulerChecks: | No illegal direct scheduler use detected in the wiring model. | |
| node1 | 7.221s | 2025-11-17 21:51:41.465 | 48 | INFO | STARTUP | <<start-node-1>> | InputWireChecks: | All input wires have been bound. | |
| node1 | 7.222s | 2025-11-17 21:51:41.466 | 49 | WARN | STARTUP | <<start-node-1>> | PcesFileTracker: | No preconsensus event files available | |
| node1 | 7.223s | 2025-11-17 21:51:41.467 | 50 | INFO | STARTUP | <<start-node-1>> | SwirldsPlatform: | replaying preconsensus event stream starting at 0 | |
| node1 | 7.224s | 2025-11-17 21:51:41.468 | 51 | INFO | STARTUP | <<start-node-1>> | PcesReplayer: | Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds. | |
| node1 | 7.225s | 2025-11-17 21:51:41.469 | 52 | INFO | STARTUP | <<app: appMain 1>> | ConsistencyTestingToolMain: | run called in Main. | |
| node1 | 7.230s | 2025-11-17 21:51:41.474 | 53 | INFO | PLATFORM_STATUS | <platformForkJoinThread-1> | StatusStateMachine: | Platform spent 229.0 ms in STARTING_UP. Now in REPLAYING_EVENTS | |
| node1 | 7.236s | 2025-11-17 21:51:41.480 | 54 | INFO | PLATFORM_STATUS | <platformForkJoinThread-1> | StatusStateMachine: | Platform spent 5.0 ms in REPLAYING_EVENTS. Now in OBSERVING | |
| node4 | 7.857s | 2025-11-17 21:51:42.101 | 55 | INFO | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting4.csv' ] | |
| node4 | 7.860s | 2025-11-17 21:51:42.104 | 56 | DEBUG | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ] | |
| node3 | 8.837s | 2025-11-17 21:51:43.081 | 55 | INFO | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting3.csv' ] | |
| node3 | 8.841s | 2025-11-17 21:51:43.085 | 56 | DEBUG | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ] | |
| node2 | 9.251s | 2025-11-17 21:51:43.495 | 55 | INFO | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting2.csv' ] | |
| node2 | 9.255s | 2025-11-17 21:51:43.499 | 56 | DEBUG | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ] | |
| node0 | 9.915s | 2025-11-17 21:51:44.159 | 55 | INFO | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting0.csv' ] | |
| node0 | 9.918s | 2025-11-17 21:51:44.162 | 56 | DEBUG | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ] | |
| node1 | 10.228s | 2025-11-17 21:51:44.472 | 55 | INFO | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting1.csv' ] | |
| node1 | 10.231s | 2025-11-17 21:51:44.475 | 56 | DEBUG | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ] | |
| node4 | 14.954s | 2025-11-17 21:51:49.198 | 57 | INFO | PLATFORM_STATUS | <platformForkJoinThread-5> | StatusStateMachine: | Platform spent 10.1 s in OBSERVING. Now in CHECKING | |
| node3 | 15.930s | 2025-11-17 21:51:50.174 | 57 | INFO | PLATFORM_STATUS | <platformForkJoinThread-3> | StatusStateMachine: | Platform spent 10.1 s in OBSERVING. Now in CHECKING | |
| node2 | 16.344s | 2025-11-17 21:51:50.588 | 57 | INFO | PLATFORM_STATUS | <platformForkJoinThread-7> | StatusStateMachine: | Platform spent 10.1 s in OBSERVING. Now in CHECKING | |
| node0 | 17.010s | 2025-11-17 21:51:51.254 | 57 | INFO | PLATFORM_STATUS | <platformForkJoinThread-5> | StatusStateMachine: | Platform spent 10.1 s in OBSERVING. Now in CHECKING | |
| node1 | 17.322s | 2025-11-17 21:51:51.566 | 57 | INFO | PLATFORM_STATUS | <platformForkJoinThread-2> | StatusStateMachine: | Platform spent 10.1 s in OBSERVING. Now in CHECKING | |
| node1 | 17.850s | 2025-11-17 21:51:52.094 | 59 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS | |
| node3 | 17.877s | 2025-11-17 21:51:52.121 | 59 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS | |
| node2 | 17.913s | 2025-11-17 21:51:52.157 | 59 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS | |
| node4 | 17.919s | 2025-11-17 21:51:52.163 | 58 | INFO | PLATFORM_STATUS | <platformForkJoinThread-5> | StatusStateMachine: | Platform spent 3.0 s in CHECKING. Now in ACTIVE | |
| node4 | 17.921s | 2025-11-17 21:51:52.165 | 60 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS | |
| node0 | 17.960s | 2025-11-17 21:51:52.204 | 59 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS | |
| node2 | 18.028s | 2025-11-17 21:51:52.272 | 74 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1 | |
| node2 | 18.030s | 2025-11-17 21:51:52.274 | 75 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for com.swirlds.demo.consistency.ConsistencyTestingToolState@784e3dc6 | |
| node1 | 18.114s | 2025-11-17 21:51:52.358 | 74 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1 | |
| node1 | 18.116s | 2025-11-17 21:51:52.360 | 75 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for com.swirlds.demo.consistency.ConsistencyTestingToolState@784e3dc6 | |
| node3 | 18.201s | 2025-11-17 21:51:52.445 | 74 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1 | |
| node3 | 18.203s | 2025-11-17 21:51:52.447 | 75 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for com.swirlds.demo.consistency.ConsistencyTestingToolState@7feb30ec | |
| node4 | 18.231s | 2025-11-17 21:51:52.475 | 75 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1 | |
| node4 | 18.233s | 2025-11-17 21:51:52.477 | 76 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for com.swirlds.demo.consistency.ConsistencyTestingToolState@539341cc | |
| node0 | 18.242s | 2025-11-17 21:51:52.486 | 74 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1 | |
| node0 | 18.245s | 2025-11-17 21:51:52.489 | 75 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for com.swirlds.demo.consistency.ConsistencyTestingToolState@3bd0dc15 | |
| node2 | 18.274s | 2025-11-17 21:51:52.518 | 108 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for com.swirlds.demo.consistency.ConsistencyTestingToolState@784e3dc6 | |
| node2 | 18.277s | 2025-11-17 21:51:52.521 | 109 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 1 Timestamp: 2025-11-17T21:51:50.540538219Z Next consensus number: 1 Legacy running event hash: cf68f4473a810eaa66c75c1c8c42f634e1e019f84eca785a32241092e9026862d153e9b488877fa8adf4e5681eaacb4e Legacy running event mnemonic: basic-jaguar-expand-stem Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1450302654 Root hash: 4ae9683e7437aca6e59b4272ab392eebb08a80cb627433de7b4c0c407ba1387c034932bcfc4206b5173851484ebb201a (root) VirtualMap state / frog-amazing-manage-hero | |||||||||
| node3 | 18.298s | 2025-11-17 21:51:52.542 | 90 | INFO | PLATFORM_STATUS | <platformForkJoinThread-1> | StatusStateMachine: | Platform spent 2.4 s in CHECKING. Now in ACTIVE | |
| node0 | 18.319s | 2025-11-17 21:51:52.563 | 90 | INFO | PLATFORM_STATUS | <platformForkJoinThread-4> | StatusStateMachine: | Platform spent 1.3 s in CHECKING. Now in ACTIVE | |
| node2 | 18.324s | 2025-11-17 21:51:52.568 | 110 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+51+49.432009215Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 18.325s | 2025-11-17 21:51:52.569 | 111 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 1 File: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+51+49.432009215Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 18.325s | 2025-11-17 21:51:52.569 | 112 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node2 | 18.327s | 2025-11-17 21:51:52.571 | 113 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node2 | 18.334s | 2025-11-17 21:51:52.578 | 114 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node2 | 18.335s | 2025-11-17 21:51:52.579 | 116 | INFO | PLATFORM_STATUS | <platformForkJoinThread-5> | StatusStateMachine: | Platform spent 2.0 s in CHECKING. Now in ACTIVE | |
| node1 | 18.382s | 2025-11-17 21:51:52.626 | 108 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for com.swirlds.demo.consistency.ConsistencyTestingToolState@784e3dc6 | |
| node1 | 18.386s | 2025-11-17 21:51:52.630 | 109 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 1 Timestamp: 2025-11-17T21:51:50.540538219Z Next consensus number: 1 Legacy running event hash: cf68f4473a810eaa66c75c1c8c42f634e1e019f84eca785a32241092e9026862d153e9b488877fa8adf4e5681eaacb4e Legacy running event mnemonic: basic-jaguar-expand-stem Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1450302654 Root hash: 4ae9683e7437aca6e59b4272ab392eebb08a80cb627433de7b4c0c407ba1387c034932bcfc4206b5173851484ebb201a (root) VirtualMap state / frog-amazing-manage-hero | |||||||||
| node1 | 18.430s | 2025-11-17 21:51:52.674 | 110 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+51+49.359203665Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node1 | 18.430s | 2025-11-17 21:51:52.674 | 111 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 1 File: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+51+49.359203665Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node1 | 18.431s | 2025-11-17 21:51:52.675 | 112 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node1 | 18.432s | 2025-11-17 21:51:52.676 | 113 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node1 | 18.438s | 2025-11-17 21:51:52.682 | 114 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node3 | 18.454s | 2025-11-17 21:51:52.698 | 108 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for com.swirlds.demo.consistency.ConsistencyTestingToolState@7feb30ec | |
| node3 | 18.457s | 2025-11-17 21:51:52.701 | 109 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 1 Timestamp: 2025-11-17T21:51:50.540538219Z Next consensus number: 1 Legacy running event hash: cf68f4473a810eaa66c75c1c8c42f634e1e019f84eca785a32241092e9026862d153e9b488877fa8adf4e5681eaacb4e Legacy running event mnemonic: basic-jaguar-expand-stem Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1450302654 Root hash: 4ae9683e7437aca6e59b4272ab392eebb08a80cb627433de7b4c0c407ba1387c034932bcfc4206b5173851484ebb201a (root) VirtualMap state / frog-amazing-manage-hero | |||||||||
| node4 | 18.477s | 2025-11-17 21:51:52.721 | 110 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for com.swirlds.demo.consistency.ConsistencyTestingToolState@539341cc | |
| node4 | 18.480s | 2025-11-17 21:51:52.724 | 111 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 1 Timestamp: 2025-11-17T21:51:50.540538219Z Next consensus number: 1 Legacy running event hash: cf68f4473a810eaa66c75c1c8c42f634e1e019f84eca785a32241092e9026862d153e9b488877fa8adf4e5681eaacb4e Legacy running event mnemonic: basic-jaguar-expand-stem Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1450302654 Root hash: 4ae9683e7437aca6e59b4272ab392eebb08a80cb627433de7b4c0c407ba1387c034932bcfc4206b5173851484ebb201a (root) VirtualMap state / frog-amazing-manage-hero | |||||||||
| node3 | 18.500s | 2025-11-17 21:51:52.744 | 110 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+51+49.340458921Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 18.501s | 2025-11-17 21:51:52.745 | 110 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for com.swirlds.demo.consistency.ConsistencyTestingToolState@3bd0dc15 | |
| node3 | 18.501s | 2025-11-17 21:51:52.745 | 111 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 1 File: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+51+49.340458921Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 18.501s | 2025-11-17 21:51:52.745 | 112 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node3 | 18.502s | 2025-11-17 21:51:52.746 | 113 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node0 | 18.505s | 2025-11-17 21:51:52.749 | 111 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 1 Timestamp: 2025-11-17T21:51:50.540538219Z Next consensus number: 1 Legacy running event hash: cf68f4473a810eaa66c75c1c8c42f634e1e019f84eca785a32241092e9026862d153e9b488877fa8adf4e5681eaacb4e Legacy running event mnemonic: basic-jaguar-expand-stem Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1450302654 Root hash: 4ae9683e7437aca6e59b4272ab392eebb08a80cb627433de7b4c0c407ba1387c034932bcfc4206b5173851484ebb201a (root) VirtualMap state / frog-amazing-manage-hero | |||||||||
| node3 | 18.508s | 2025-11-17 21:51:52.752 | 114 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node4 | 18.524s | 2025-11-17 21:51:52.768 | 112 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/4/2025/11/17/2025-11-17T21+51+49.234736072Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node4 | 18.524s | 2025-11-17 21:51:52.768 | 113 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 1 File: data/saved/preconsensus-events/4/2025/11/17/2025-11-17T21+51+49.234736072Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node4 | 18.524s | 2025-11-17 21:51:52.768 | 114 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node4 | 18.526s | 2025-11-17 21:51:52.770 | 115 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node4 | 18.532s | 2025-11-17 21:51:52.776 | 116 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node0 | 18.548s | 2025-11-17 21:51:52.792 | 112 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+51+49.530997129Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 18.549s | 2025-11-17 21:51:52.793 | 113 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 1 File: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+51+49.530997129Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 18.549s | 2025-11-17 21:51:52.793 | 114 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node0 | 18.550s | 2025-11-17 21:51:52.794 | 115 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node0 | 18.556s | 2025-11-17 21:51:52.800 | 116 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node1 | 18.770s | 2025-11-17 21:51:53.014 | 124 | INFO | PLATFORM_STATUS | <platformForkJoinThread-1> | StatusStateMachine: | Platform spent 1.4 s in CHECKING. Now in ACTIVE | |
| node2 | 26.781s | 2025-11-17 21:52:01.025 | 303 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 21 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node4 | 26.818s | 2025-11-17 21:52:01.062 | 304 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 21 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node0 | 26.836s | 2025-11-17 21:52:01.080 | 310 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 21 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node1 | 26.839s | 2025-11-17 21:52:01.083 | 307 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 21 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node3 | 26.926s | 2025-11-17 21:52:01.170 | 308 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 21 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node1 | 27.006s | 2025-11-17 21:52:01.250 | 309 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 21 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/21 | |
| node1 | 27.007s | 2025-11-17 21:52:01.251 | 310 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for com.swirlds.demo.consistency.ConsistencyTestingToolState@22f45e83 | |
| node3 | 27.039s | 2025-11-17 21:52:01.283 | 310 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 21 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/21 | |
| node3 | 27.039s | 2025-11-17 21:52:01.283 | 311 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for com.swirlds.demo.consistency.ConsistencyTestingToolState@25d5aea6 | |
| node1 | 27.101s | 2025-11-17 21:52:01.345 | 349 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for com.swirlds.demo.consistency.ConsistencyTestingToolState@22f45e83 | |
| node1 | 27.104s | 2025-11-17 21:52:01.348 | 350 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 21 Timestamp: 2025-11-17T21:52:00.118225Z Next consensus number: 677 Legacy running event hash: fcdb84768fd98e09848f71672254dd81ed43d804b444c991232df926e0854cd2014e82edc2015a393966ba1bbb870d4c Legacy running event mnemonic: gorilla-image-agent-discover Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -234925819 Root hash: 026b0da78e806b77529bad6781b5a4e7bee2ccf72f31f68f94df1717a4821f0c6623763b6133bbc6cb79b107c96bbf15 (root) VirtualMap state / estate-reject-copy-humble | |||||||||
| node1 | 27.114s | 2025-11-17 21:52:01.358 | 351 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+51+49.359203665Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 27.115s | 2025-11-17 21:52:01.359 | 312 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 21 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/21 | |
| node1 | 27.115s | 2025-11-17 21:52:01.359 | 352 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 1 File: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+51+49.359203665Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 27.116s | 2025-11-17 21:52:01.360 | 313 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for com.swirlds.demo.consistency.ConsistencyTestingToolState@55af004a | |
| node1 | 27.116s | 2025-11-17 21:52:01.360 | 353 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node1 | 27.117s | 2025-11-17 21:52:01.361 | 354 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node1 | 27.117s | 2025-11-17 21:52:01.361 | 355 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 21 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/21 {"round":21,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/21/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node3 | 27.130s | 2025-11-17 21:52:01.374 | 342 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for com.swirlds.demo.consistency.ConsistencyTestingToolState@25d5aea6 | |
| node3 | 27.133s | 2025-11-17 21:52:01.377 | 343 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 21 Timestamp: 2025-11-17T21:52:00.118225Z Next consensus number: 677 Legacy running event hash: fcdb84768fd98e09848f71672254dd81ed43d804b444c991232df926e0854cd2014e82edc2015a393966ba1bbb870d4c Legacy running event mnemonic: gorilla-image-agent-discover Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -234925819 Root hash: 026b0da78e806b77529bad6781b5a4e7bee2ccf72f31f68f94df1717a4821f0c6623763b6133bbc6cb79b107c96bbf15 (root) VirtualMap state / estate-reject-copy-humble | |||||||||
| node3 | 27.141s | 2025-11-17 21:52:01.385 | 344 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+51+49.340458921Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 27.141s | 2025-11-17 21:52:01.385 | 345 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 1 File: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+51+49.340458921Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 27.142s | 2025-11-17 21:52:01.386 | 346 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node3 | 27.143s | 2025-11-17 21:52:01.387 | 347 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node3 | 27.144s | 2025-11-17 21:52:01.388 | 348 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 21 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/21 {"round":21,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/21/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node2 | 27.149s | 2025-11-17 21:52:01.393 | 315 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 21 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/21 | |
| node2 | 27.150s | 2025-11-17 21:52:01.394 | 316 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for com.swirlds.demo.consistency.ConsistencyTestingToolState@4ede5b9a | |
| node4 | 27.169s | 2025-11-17 21:52:01.413 | 316 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 21 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/21 | |
| node4 | 27.170s | 2025-11-17 21:52:01.414 | 317 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for com.swirlds.demo.consistency.ConsistencyTestingToolState@b42e59c | |
| node0 | 27.205s | 2025-11-17 21:52:01.449 | 344 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for com.swirlds.demo.consistency.ConsistencyTestingToolState@55af004a | |
| node0 | 27.208s | 2025-11-17 21:52:01.452 | 345 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 21 Timestamp: 2025-11-17T21:52:00.118225Z Next consensus number: 677 Legacy running event hash: fcdb84768fd98e09848f71672254dd81ed43d804b444c991232df926e0854cd2014e82edc2015a393966ba1bbb870d4c Legacy running event mnemonic: gorilla-image-agent-discover Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -234925819 Root hash: 026b0da78e806b77529bad6781b5a4e7bee2ccf72f31f68f94df1717a4821f0c6623763b6133bbc6cb79b107c96bbf15 (root) VirtualMap state / estate-reject-copy-humble | |||||||||
| node0 | 27.218s | 2025-11-17 21:52:01.462 | 346 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+51+49.530997129Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 27.218s | 2025-11-17 21:52:01.462 | 347 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 1 File: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+51+49.530997129Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 27.219s | 2025-11-17 21:52:01.463 | 348 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node0 | 27.220s | 2025-11-17 21:52:01.464 | 349 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node0 | 27.220s | 2025-11-17 21:52:01.464 | 350 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 21 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/21 {"round":21,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/21/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node2 | 27.239s | 2025-11-17 21:52:01.483 | 347 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for com.swirlds.demo.consistency.ConsistencyTestingToolState@4ede5b9a | |
| node2 | 27.242s | 2025-11-17 21:52:01.486 | 348 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 21 Timestamp: 2025-11-17T21:52:00.118225Z Next consensus number: 677 Legacy running event hash: fcdb84768fd98e09848f71672254dd81ed43d804b444c991232df926e0854cd2014e82edc2015a393966ba1bbb870d4c Legacy running event mnemonic: gorilla-image-agent-discover Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -234925819 Root hash: 026b0da78e806b77529bad6781b5a4e7bee2ccf72f31f68f94df1717a4821f0c6623763b6133bbc6cb79b107c96bbf15 (root) VirtualMap state / estate-reject-copy-humble | |||||||||
| node2 | 27.250s | 2025-11-17 21:52:01.494 | 357 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+51+49.432009215Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 27.250s | 2025-11-17 21:52:01.494 | 358 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 1 File: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+51+49.432009215Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 27.251s | 2025-11-17 21:52:01.495 | 359 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node2 | 27.252s | 2025-11-17 21:52:01.496 | 360 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node2 | 27.253s | 2025-11-17 21:52:01.497 | 361 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 21 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/21 {"round":21,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/21/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node4 | 27.255s | 2025-11-17 21:52:01.499 | 356 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for com.swirlds.demo.consistency.ConsistencyTestingToolState@b42e59c | |
| node4 | 27.257s | 2025-11-17 21:52:01.501 | 357 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 21 Timestamp: 2025-11-17T21:52:00.118225Z Next consensus number: 677 Legacy running event hash: fcdb84768fd98e09848f71672254dd81ed43d804b444c991232df926e0854cd2014e82edc2015a393966ba1bbb870d4c Legacy running event mnemonic: gorilla-image-agent-discover Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -234925819 Root hash: 026b0da78e806b77529bad6781b5a4e7bee2ccf72f31f68f94df1717a4821f0c6623763b6133bbc6cb79b107c96bbf15 (root) VirtualMap state / estate-reject-copy-humble | |||||||||
| node4 | 27.265s | 2025-11-17 21:52:01.509 | 358 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/4/2025/11/17/2025-11-17T21+51+49.234736072Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node4 | 27.265s | 2025-11-17 21:52:01.509 | 359 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 1 File: data/saved/preconsensus-events/4/2025/11/17/2025-11-17T21+51+49.234736072Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node4 | 27.266s | 2025-11-17 21:52:01.510 | 360 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node4 | 27.267s | 2025-11-17 21:52:01.511 | 361 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node4 | 27.268s | 2025-11-17 21:52:01.512 | 362 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 21 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/21 {"round":21,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/21/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
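The `Finished writing state ...` entries above carry a machine-readable JSON payload alongside the free-text message, tagged with the payload class `com.swirlds.logging.legacy.payload.StateSavedToDiskPayload`. A minimal Python sketch for pulling that payload out of such a line; the regex and variable names are illustrative, not part of the platform:

```python
# Extract the StateSavedToDiskPayload JSON from a "Finished writing state" log message.
# Assumption: the JSON object sits between the free-text message and the bracketed
# payload class name, as in the node0 entry for round 21 above.
import json
import re

line = (
    'Finished writing state for round 21 to disk. Reason: PERIODIC_SNAPSHOT, '
    'directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/'
    'com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/21 '
    '{"round":21,"freezeState":false,"reason":"PERIODIC_SNAPSHOT",'
    '"directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/'
    'com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/21/"} '
    '[com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]'
)

match = re.search(r'(\{.*\})\s*\[([\w.]+)\]\s*$', line)
if match:
    payload = json.loads(match.group(1))
    payload_class = match.group(2)
    print(payload_class)                         # ...StateSavedToDiskPayload
    print(payload["round"], payload["reason"])   # 21 PERIODIC_SNAPSHOT
    print(payload["directory"])
```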
| node3 | 1m 27.025s | 2025-11-17 21:53:01.269 | 1832 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 155 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node0 | 1m 27.026s | 2025-11-17 21:53:01.270 | 1814 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 155 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node4 | 1m 27.047s | 2025-11-17 21:53:01.291 | 1826 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 155 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node1 | 1m 27.057s | 2025-11-17 21:53:01.301 | 1817 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 155 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node2 | 1m 27.113s | 2025-11-17 21:53:01.357 | 1853 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 155 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node3 | 1m 27.244s | 2025-11-17 21:53:01.488 | 1835 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 155 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/155 | |
| node3 | 1m 27.245s | 2025-11-17 21:53:01.489 | 1836 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for com.swirlds.demo.consistency.ConsistencyTestingToolState@6f021122 | |
| node2 | 1m 27.259s | 2025-11-17 21:53:01.503 | 1856 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 155 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/155 | |
| node2 | 1m 27.260s | 2025-11-17 21:53:01.504 | 1857 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for com.swirlds.demo.consistency.ConsistencyTestingToolState@76da1da9 | |
| node1 | 1m 27.315s | 2025-11-17 21:53:01.559 | 1820 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 155 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/155 | |
| node1 | 1m 27.316s | 2025-11-17 21:53:01.560 | 1821 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for com.swirlds.demo.consistency.ConsistencyTestingToolState@7e360f96 | |
| node3 | 1m 27.323s | 2025-11-17 21:53:01.567 | 1867 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for com.swirlds.demo.consistency.ConsistencyTestingToolState@6f021122 | |
| node3 | 1m 27.325s | 2025-11-17 21:53:01.569 | 1868 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 155 Timestamp: 2025-11-17T21:53:00.302796Z Next consensus number: 5491 Legacy running event hash: fda85a16bae1e9fd782f25dc49714d13d28899fe79215e9a1b0ef2e45f0f56e677f6ba3dd1731b870111c57dccdf2ff0 Legacy running event mnemonic: enrich-drift-physical-scatter Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 882889364 Root hash: 1d4349dd37273a88d2678d7618ea3e7bf20b2f7932afed9e1785bedfbdafcb3433b1a6491808b670494f0fff7ccc278e (root) VirtualMap state / position-desert-top-extra | |||||||||
| node3 | 1m 27.335s | 2025-11-17 21:53:01.579 | 1869 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+51+49.340458921Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 1m 27.335s | 2025-11-17 21:53:01.579 | 1870 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 127 File: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+51+49.340458921Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 1m 27.335s | 2025-11-17 21:53:01.579 | 1871 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node2 | 1m 27.337s | 2025-11-17 21:53:01.581 | 1888 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for com.swirlds.demo.consistency.ConsistencyTestingToolState@76da1da9 | |
| node2 | 1m 27.339s | 2025-11-17 21:53:01.583 | 1889 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 155 Timestamp: 2025-11-17T21:53:00.302796Z Next consensus number: 5491 Legacy running event hash: fda85a16bae1e9fd782f25dc49714d13d28899fe79215e9a1b0ef2e45f0f56e677f6ba3dd1731b870111c57dccdf2ff0 Legacy running event mnemonic: enrich-drift-physical-scatter Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 882889364 Root hash: 1d4349dd37273a88d2678d7618ea3e7bf20b2f7932afed9e1785bedfbdafcb3433b1a6491808b670494f0fff7ccc278e (root) VirtualMap state / position-desert-top-extra | |||||||||
| node3 | 1m 27.339s | 2025-11-17 21:53:01.583 | 1872 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node3 | 1m 27.340s | 2025-11-17 21:53:01.584 | 1873 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 155 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/155 {"round":155,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/155/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node2 | 1m 27.347s | 2025-11-17 21:53:01.591 | 1890 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+51+49.432009215Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 1m 27.347s | 2025-11-17 21:53:01.591 | 1891 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 127 File: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+51+49.432009215Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 1m 27.348s | 2025-11-17 21:53:01.592 | 1892 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node2 | 1m 27.352s | 2025-11-17 21:53:01.596 | 1893 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node2 | 1m 27.352s | 2025-11-17 21:53:01.596 | 1894 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 155 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/155 {"round":155,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/155/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node0 | 1m 27.391s | 2025-11-17 21:53:01.635 | 1817 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 155 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/155 | |
| node0 | 1m 27.392s | 2025-11-17 21:53:01.636 | 1818 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for com.swirlds.demo.consistency.ConsistencyTestingToolState@5dff94 | |
| node4 | 1m 27.399s | 2025-11-17 21:53:01.643 | 1839 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 155 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/155 | |
| node4 | 1m 27.400s | 2025-11-17 21:53:01.644 | 1840 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for com.swirlds.demo.consistency.ConsistencyTestingToolState@2f19f33e | |
| node1 | 1m 27.403s | 2025-11-17 21:53:01.647 | 1852 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for com.swirlds.demo.consistency.ConsistencyTestingToolState@7e360f96 | |
| node1 | 1m 27.405s | 2025-11-17 21:53:01.649 | 1853 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 155 Timestamp: 2025-11-17T21:53:00.302796Z Next consensus number: 5491 Legacy running event hash: fda85a16bae1e9fd782f25dc49714d13d28899fe79215e9a1b0ef2e45f0f56e677f6ba3dd1731b870111c57dccdf2ff0 Legacy running event mnemonic: enrich-drift-physical-scatter Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 882889364 Root hash: 1d4349dd37273a88d2678d7618ea3e7bf20b2f7932afed9e1785bedfbdafcb3433b1a6491808b670494f0fff7ccc278e (root) VirtualMap state / position-desert-top-extra | |||||||||
| node1 | 1m 27.414s | 2025-11-17 21:53:01.658 | 1854 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+51+49.359203665Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node1 | 1m 27.414s | 2025-11-17 21:53:01.658 | 1855 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 127 File: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+51+49.359203665Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node1 | 1m 27.414s | 2025-11-17 21:53:01.658 | 1856 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node1 | 1m 27.418s | 2025-11-17 21:53:01.662 | 1857 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node1 | 1m 27.419s | 2025-11-17 21:53:01.663 | 1858 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 155 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/155 {"round":155,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/155/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node4 | 1m 27.475s | 2025-11-17 21:53:01.719 | 1879 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for com.swirlds.demo.consistency.ConsistencyTestingToolState@2f19f33e | |
| node4 | 1m 27.477s | 2025-11-17 21:53:01.721 | 1880 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 155 Timestamp: 2025-11-17T21:53:00.302796Z Next consensus number: 5491 Legacy running event hash: fda85a16bae1e9fd782f25dc49714d13d28899fe79215e9a1b0ef2e45f0f56e677f6ba3dd1731b870111c57dccdf2ff0 Legacy running event mnemonic: enrich-drift-physical-scatter Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 882889364 Root hash: 1d4349dd37273a88d2678d7618ea3e7bf20b2f7932afed9e1785bedfbdafcb3433b1a6491808b670494f0fff7ccc278e (root) VirtualMap state / position-desert-top-extra | |||||||||
| node0 | 1m 27.485s | 2025-11-17 21:53:01.729 | 1849 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for com.swirlds.demo.consistency.ConsistencyTestingToolState@5dff94 | |
| node4 | 1m 27.486s | 2025-11-17 21:53:01.730 | 1881 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/4/2025/11/17/2025-11-17T21+51+49.234736072Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node4 | 1m 27.486s | 2025-11-17 21:53:01.730 | 1882 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 127 File: data/saved/preconsensus-events/4/2025/11/17/2025-11-17T21+51+49.234736072Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node4 | 1m 27.486s | 2025-11-17 21:53:01.730 | 1883 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node0 | 1m 27.488s | 2025-11-17 21:53:01.732 | 1850 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 155 Timestamp: 2025-11-17T21:53:00.302796Z Next consensus number: 5491 Legacy running event hash: fda85a16bae1e9fd782f25dc49714d13d28899fe79215e9a1b0ef2e45f0f56e677f6ba3dd1731b870111c57dccdf2ff0 Legacy running event mnemonic: enrich-drift-physical-scatter Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 882889364 Root hash: 1d4349dd37273a88d2678d7618ea3e7bf20b2f7932afed9e1785bedfbdafcb3433b1a6491808b670494f0fff7ccc278e (root) VirtualMap state / position-desert-top-extra | |||||||||
| node4 | 1m 27.490s | 2025-11-17 21:53:01.734 | 1884 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node4 | 1m 27.491s | 2025-11-17 21:53:01.735 | 1885 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 155 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/155 {"round":155,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/155/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node0 | 1m 27.501s | 2025-11-17 21:53:01.745 | 1851 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+51+49.530997129Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 1m 27.502s | 2025-11-17 21:53:01.746 | 1852 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 127 File: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+51+49.530997129Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 1m 27.502s | 2025-11-17 21:53:01.746 | 1853 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node0 | 1m 27.506s | 2025-11-17 21:53:01.750 | 1854 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node0 | 1m 27.507s | 2025-11-17 21:53:01.751 | 1855 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 155 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/155 {"round":155,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/155/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
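The preconsensus event files copied during each snapshot all follow one naming pattern, e.g. `2025-11-17T21+51+49.530997129Z_seq0_minr1_maxr501_orgn0.pces`. A minimal sketch that splits such a name into its fields; reading `seq`, `minr`, `maxr` and `orgn` as sequence number, minimum round, maximum round and origin is an assumption based on the names, and the `+` characters appear to stand in for `:` in the creation timestamp:

```python
# Split a PCES file name (as seen in the BestEffortPcesFileCopy entries above) into fields.
import re

name = "2025-11-17T21+51+49.530997129Z_seq0_minr1_maxr501_orgn0.pces"

m = re.fullmatch(
    r"(?P<ts>.+Z)_seq(?P<seq>\d+)_minr(?P<minr>\d+)_maxr(?P<maxr>\d+)_orgn(?P<orgn>\d+)\.pces",
    name,
)
fields = m.groupdict()
# Restore ':' in the timestamp; '+' keeps the name legal on file systems that forbid ':'.
fields["ts"] = fields["ts"].replace("+", ":")
print(fields)
# {'ts': '2025-11-17T21:51:49.530997129Z', 'seq': '0', 'minr': '1', 'maxr': '501', 'orgn': '0'}
```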
| node0 | 2m 26.984s | 2025-11-17 21:54:01.228 | 3298 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 289 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node1 | 2m 27.057s | 2025-11-17 21:54:01.301 | 3331 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 289 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node3 | 2m 27.065s | 2025-11-17 21:54:01.309 | 3338 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 289 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node2 | 2m 27.090s | 2025-11-17 21:54:01.334 | 3361 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 289 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node4 | 2m 27.129s | 2025-11-17 21:54:01.373 | 3350 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 289 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node3 | 2m 27.200s | 2025-11-17 21:54:01.444 | 3341 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 289 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/289 | |
| node3 | 2m 27.201s | 2025-11-17 21:54:01.445 | 3342 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for com.swirlds.demo.consistency.ConsistencyTestingToolState@615f8396 | |
| node1 | 2m 27.239s | 2025-11-17 21:54:01.483 | 3334 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 289 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/289 | |
| node1 | 2m 27.240s | 2025-11-17 21:54:01.484 | 3335 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for com.swirlds.demo.consistency.ConsistencyTestingToolState@4b2da925 | |
| node3 | 2m 27.288s | 2025-11-17 21:54:01.532 | 3381 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for com.swirlds.demo.consistency.ConsistencyTestingToolState@615f8396 | |
| node3 | 2m 27.291s | 2025-11-17 21:54:01.535 | 3382 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 289 Timestamp: 2025-11-17T21:54:00.431712425Z Next consensus number: 10264 Legacy running event hash: 5830c815b8bc7a1a17b6c3399345755c7a82ba9cd852062f27d64f0813ffefeb3451a5133de5b82a8810b35875d9a942 Legacy running event mnemonic: alcohol-word-trigger-drive Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -445431835 Root hash: 7d01d117598d32e09001027345fa0af7a7c81a9e6ea46435bf6df5eb58114fa31188f2837dbba7f78d4c7a0c160c4d98 (root) VirtualMap state / parent-tag-coyote-foster | |||||||||
| node3 | 2m 27.299s | 2025-11-17 21:54:01.543 | 3383 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+51+49.340458921Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 2m 27.300s | 2025-11-17 21:54:01.544 | 3384 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 262 File: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+51+49.340458921Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 2m 27.300s | 2025-11-17 21:54:01.544 | 3385 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node4 | 2m 27.305s | 2025-11-17 21:54:01.549 | 3353 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 289 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/289 | |
| node4 | 2m 27.306s | 2025-11-17 21:54:01.550 | 3354 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for com.swirlds.demo.consistency.ConsistencyTestingToolState@49a3b3bb | |
| node3 | 2m 27.307s | 2025-11-17 21:54:01.551 | 3386 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node3 | 2m 27.308s | 2025-11-17 21:54:01.552 | 3387 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 289 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/289 {"round":289,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/289/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node2 | 2m 27.309s | 2025-11-17 21:54:01.553 | 3364 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 289 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/289 | |
| node2 | 2m 27.309s | 2025-11-17 21:54:01.553 | 3365 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for com.swirlds.demo.consistency.ConsistencyTestingToolState@6fe1b7b9 | |
| node1 | 2m 27.327s | 2025-11-17 21:54:01.571 | 3366 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for com.swirlds.demo.consistency.ConsistencyTestingToolState@4b2da925 | |
| node1 | 2m 27.329s | 2025-11-17 21:54:01.573 | 3367 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 289 Timestamp: 2025-11-17T21:54:00.431712425Z Next consensus number: 10264 Legacy running event hash: 5830c815b8bc7a1a17b6c3399345755c7a82ba9cd852062f27d64f0813ffefeb3451a5133de5b82a8810b35875d9a942 Legacy running event mnemonic: alcohol-word-trigger-drive Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -445431835 Root hash: 7d01d117598d32e09001027345fa0af7a7c81a9e6ea46435bf6df5eb58114fa31188f2837dbba7f78d4c7a0c160c4d98 (root) VirtualMap state / parent-tag-coyote-foster | |||||||||
| node1 | 2m 27.336s | 2025-11-17 21:54:01.580 | 3368 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+51+49.359203665Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node1 | 2m 27.336s | 2025-11-17 21:54:01.580 | 3369 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 262 File: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+51+49.359203665Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node1 | 2m 27.337s | 2025-11-17 21:54:01.581 | 3370 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node1 | 2m 27.344s | 2025-11-17 21:54:01.588 | 3371 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node1 | 2m 27.345s | 2025-11-17 21:54:01.589 | 3372 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 289 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/289 {"round":289,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/289/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node0 | 2m 27.347s | 2025-11-17 21:54:01.591 | 3339 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 289 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/289 | |
| node0 | 2m 27.348s | 2025-11-17 21:54:01.592 | 3340 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for com.swirlds.demo.consistency.ConsistencyTestingToolState@75f90e1d | |
| node4 | 2m 27.384s | 2025-11-17 21:54:01.628 | 3385 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for com.swirlds.demo.consistency.ConsistencyTestingToolState@49a3b3bb | |
| node4 | 2m 27.386s | 2025-11-17 21:54:01.630 | 3386 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 289 Timestamp: 2025-11-17T21:54:00.431712425Z Next consensus number: 10264 Legacy running event hash: 5830c815b8bc7a1a17b6c3399345755c7a82ba9cd852062f27d64f0813ffefeb3451a5133de5b82a8810b35875d9a942 Legacy running event mnemonic: alcohol-word-trigger-drive Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -445431835 Root hash: 7d01d117598d32e09001027345fa0af7a7c81a9e6ea46435bf6df5eb58114fa31188f2837dbba7f78d4c7a0c160c4d98 (root) VirtualMap state / parent-tag-coyote-foster | |||||||||
| node2 | 2m 27.390s | 2025-11-17 21:54:01.634 | 3396 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for com.swirlds.demo.consistency.ConsistencyTestingToolState@6fe1b7b9 | |
| node2 | 2m 27.392s | 2025-11-17 21:54:01.636 | 3397 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 289 Timestamp: 2025-11-17T21:54:00.431712425Z Next consensus number: 10264 Legacy running event hash: 5830c815b8bc7a1a17b6c3399345755c7a82ba9cd852062f27d64f0813ffefeb3451a5133de5b82a8810b35875d9a942 Legacy running event mnemonic: alcohol-word-trigger-drive Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -445431835 Root hash: 7d01d117598d32e09001027345fa0af7a7c81a9e6ea46435bf6df5eb58114fa31188f2837dbba7f78d4c7a0c160c4d98 (root) VirtualMap state / parent-tag-coyote-foster | |||||||||
| node4 | 2m 27.393s | 2025-11-17 21:54:01.637 | 3387 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/4/2025/11/17/2025-11-17T21+51+49.234736072Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node4 | 2m 27.393s | 2025-11-17 21:54:01.637 | 3388 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 262 File: data/saved/preconsensus-events/4/2025/11/17/2025-11-17T21+51+49.234736072Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node4 | 2m 27.394s | 2025-11-17 21:54:01.638 | 3389 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node2 | 2m 27.399s | 2025-11-17 21:54:01.643 | 3398 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+51+49.432009215Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 2m 27.400s | 2025-11-17 21:54:01.644 | 3399 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 262 File: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+51+49.432009215Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 2m 27.400s | 2025-11-17 21:54:01.644 | 3400 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node4 | 2m 27.401s | 2025-11-17 21:54:01.645 | 3390 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node4 | 2m 27.401s | 2025-11-17 21:54:01.645 | 3391 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 289 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/289 {"round":289,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/289/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node2 | 2m 27.407s | 2025-11-17 21:54:01.651 | 3401 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node2 | 2m 27.408s | 2025-11-17 21:54:01.652 | 3402 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 289 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/289 {"round":289,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/289/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node0 | 2m 27.434s | 2025-11-17 21:54:01.678 | 3374 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for com.swirlds.demo.consistency.ConsistencyTestingToolState@75f90e1d | |
| node0 | 2m 27.437s | 2025-11-17 21:54:01.681 | 3375 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 289 Timestamp: 2025-11-17T21:54:00.431712425Z Next consensus number: 10264 Legacy running event hash: 5830c815b8bc7a1a17b6c3399345755c7a82ba9cd852062f27d64f0813ffefeb3451a5133de5b82a8810b35875d9a942 Legacy running event mnemonic: alcohol-word-trigger-drive Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -445431835 Root hash: 7d01d117598d32e09001027345fa0af7a7c81a9e6ea46435bf6df5eb58114fa31188f2837dbba7f78d4c7a0c160c4d98 (root) VirtualMap state / parent-tag-coyote-foster | |||||||||
| node0 | 2m 27.446s | 2025-11-17 21:54:01.690 | 3376 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+51+49.530997129Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 2m 27.447s | 2025-11-17 21:54:01.691 | 3377 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 262 File: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+51+49.530997129Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 2m 27.447s | 2025-11-17 21:54:01.691 | 3378 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node0 | 2m 27.455s | 2025-11-17 21:54:01.699 | 3379 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node0 | 2m 27.455s | 2025-11-17 21:54:01.699 | 3380 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 289 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/289 {"round":289,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/289/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
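For round 289 every node reports the same root hash in its "Information for state written to disk" block, which is the kind of agreement this consistency-test run is meant to show. A minimal sketch of how those reports could be grouped by round to flag divergence; the helper and the shortened sample hashes are illustrative only:

```python
# Group the "Root hash" reported by each node per round and flag any disagreement.
import re
from collections import defaultdict

def root_hashes_by_round(records):
    """records: iterable of (node, info_text) pairs taken from
    'Information for state written to disk' rows."""
    seen = defaultdict(dict)
    for node, text in records:
        round_no = int(re.search(r"Round:\s*(\d+)", text).group(1))
        root = re.search(r"Root hash:\s*([0-9a-f]+)", text).group(1)
        seen[round_no][node] = root
    return seen

# Toy records with shortened hashes, purely for illustration.
sample = [
    ("node3", "Round: 289 Root hash: 7d01d117598d32e0 (root)"),
    ("node1", "Round: 289 Root hash: 7d01d117598d32e0 (root)"),
    ("node0", "Round: 289 Root hash: 7d01d117598d32e0 (root)"),
]
for round_no, per_node in sorted(root_hashes_by_round(sample).items()):
    agreed = len(set(per_node.values())) == 1
    print(round_no, "hashes agree" if agreed else f"DIVERGENCE: {per_node}")
```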
| node1 | 3m 13.615s | 2025-11-17 21:54:47.859 | 4550 | WARN | SOCKET_EXCEPTIONS | <<platform-core: SyncProtocolWith4 1 to 4>> | NetworkUtils: | Connection broken: 1 -> 4 | |
| com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-17T21:54:47.857578009Z at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293) at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47) at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79) at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:65) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200) at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654) at java.base/java.lang.Thread.run(Thread.java:1583) Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection or outbound has closed at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115) at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64) at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125) at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252) at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240) at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:387) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection reset at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318) at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346) at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796) at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099) at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489) at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483) at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70) at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461) at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73) at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63) at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291) at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347) at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420) at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399) at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208) at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319) at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:431) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more | |||||||||
| node3 | 3m 13.615s | 2025-11-17 21:54:47.859 | 4537 | WARN | SOCKET_EXCEPTIONS | <<platform-core: SyncProtocolWith4 3 to 4>> | NetworkUtils: | Connection broken: 3 -> 4 | |
| com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-17T21:54:47.858056686Z at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293) at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47) at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79) at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:65) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200) at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654) at java.base/java.lang.Thread.run(Thread.java:1583) Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection or outbound has closed at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115) at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64) at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125) at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252) at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240) at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:387) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection reset at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318) at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346) at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796) at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099) at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489) at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483) at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70) at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461) at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73) at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63) at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291) at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347) at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420) at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399) at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208) at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319) at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:431) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more | |||||||||
| node0 | 3m 13.618s | 2025-11-17 21:54:47.862 | 4577 | WARN | SOCKET_EXCEPTIONS | <<platform-core: SyncProtocolWith4 0 to 4>> | NetworkUtils: | Connection broken: 0 -> 4 | |
| com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-17T21:54:47.857715342Z at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293) at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47) at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79) at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:65) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200) at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654) at java.base/java.lang.Thread.run(Thread.java:1583) Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection or outbound has closed at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115) at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64) at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125) at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252) at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240) at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:387) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection reset at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318) at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346) at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796) at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099) at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489) at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483) at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70) at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461) at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73) at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63) at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291) at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347) at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420) at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399) at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208) at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319) at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:431) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more | |||||||||
| node2 | 3m 13.618s | 2025-11-17 21:54:47.862 | 4568 | WARN | SOCKET_EXCEPTIONS | <<platform-core: SyncProtocolWith4 2 to 4>> | NetworkUtils: | Connection broken: 2 -> 4 | |
| com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-17T21:54:47.858239008Z at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293) at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47) at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79) at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:65) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200) at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654) at java.base/java.lang.Thread.run(Thread.java:1583) Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection or outbound has closed at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115) at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64) at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125) at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252) at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240) at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:387) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection reset at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318) at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346) at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796) at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099) at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489) at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483) at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70) at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461) at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73) at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63) at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291) at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347) at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420) at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399) at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208) at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319) at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:431) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more | |||||||||
| node0 | 3m 26.864s | 2025-11-17 21:55:01.108 | 4965 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 427 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node2 | 3m 26.989s | 2025-11-17 21:55:01.233 | 4920 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 427 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node1 | 3m 26.997s | 2025-11-17 21:55:01.241 | 4902 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 427 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node3 | 3m 27.004s | 2025-11-17 21:55:01.248 | 4889 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 427 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node3 | 3m 27.143s | 2025-11-17 21:55:01.387 | 4892 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 427 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/427 | |
| node3 | 3m 27.143s | 2025-11-17 21:55:01.387 | 4893 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for com.swirlds.demo.consistency.ConsistencyTestingToolState@7520ae4 | |
| node2 | 3m 27.194s | 2025-11-17 21:55:01.438 | 4923 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 427 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/427 | |
| node2 | 3m 27.195s | 2025-11-17 21:55:01.439 | 4924 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for com.swirlds.demo.consistency.ConsistencyTestingToolState@72ddbb7d | |
| node1 | 3m 27.214s | 2025-11-17 21:55:01.458 | 4905 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 427 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/427 | |
| node1 | 3m 27.215s | 2025-11-17 21:55:01.459 | 4906 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for com.swirlds.demo.consistency.ConsistencyTestingToolState@4935d43f | |
| node3 | 3m 27.222s | 2025-11-17 21:55:01.466 | 4924 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for com.swirlds.demo.consistency.ConsistencyTestingToolState@7520ae4 | |
| node3 | 3m 27.225s | 2025-11-17 21:55:01.469 | 4925 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 427 Timestamp: 2025-11-17T21:55:00.283498Z Next consensus number: 14747 Legacy running event hash: f59a22f7e47035c13187862976269b822864c76177140e2d10ec37a8d59229b0006cb609865f335284da624e499dfdba Legacy running event mnemonic: error-galaxy-improve-return Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -1348217637 Root hash: c2497afd6452d40e94ffda8b4c9620fc4016a6ae357e9f390c8b0bbcba8dc5cdcd9f72ad9abb693112200599c952e58e (root) VirtualMap state / report-garden-medal-layer | |||||||||
| node3 | 3m 27.233s | 2025-11-17 21:55:01.477 | 4934 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+51+49.340458921Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 3m 27.233s | 2025-11-17 21:55:01.477 | 4935 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 400 File: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+51+49.340458921Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node3 | 3m 27.233s | 2025-11-17 21:55:01.477 | 4936 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node3 | 3m 27.243s | 2025-11-17 21:55:01.487 | 4937 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node3 | 3m 27.244s | 2025-11-17 21:55:01.488 | 4938 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 427 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/427 {"round":427,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/427/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node0 | 3m 27.292s | 2025-11-17 21:55:01.536 | 4978 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 427 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/427 | |
| node0 | 3m 27.293s | 2025-11-17 21:55:01.537 | 4979 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for com.swirlds.demo.consistency.ConsistencyTestingToolState@59e4f0ca | |
| node2 | 3m 27.293s | 2025-11-17 21:55:01.537 | 4963 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for com.swirlds.demo.consistency.ConsistencyTestingToolState@72ddbb7d | |
| node2 | 3m 27.295s | 2025-11-17 21:55:01.539 | 4964 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 427 Timestamp: 2025-11-17T21:55:00.283498Z Next consensus number: 14747 Legacy running event hash: f59a22f7e47035c13187862976269b822864c76177140e2d10ec37a8d59229b0006cb609865f335284da624e499dfdba Legacy running event mnemonic: error-galaxy-improve-return Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -1348217637 Root hash: c2497afd6452d40e94ffda8b4c9620fc4016a6ae357e9f390c8b0bbcba8dc5cdcd9f72ad9abb693112200599c952e58e (root) VirtualMap state / report-garden-medal-layer | |||||||||
| node1 | 3m 27.299s | 2025-11-17 21:55:01.543 | 4945 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for com.swirlds.demo.consistency.ConsistencyTestingToolState@4935d43f | |
| node1 | 3m 27.301s | 2025-11-17 21:55:01.545 | 4946 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 427 Timestamp: 2025-11-17T21:55:00.283498Z Next consensus number: 14747 Legacy running event hash: f59a22f7e47035c13187862976269b822864c76177140e2d10ec37a8d59229b0006cb609865f335284da624e499dfdba Legacy running event mnemonic: error-galaxy-improve-return Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -1348217637 Root hash: c2497afd6452d40e94ffda8b4c9620fc4016a6ae357e9f390c8b0bbcba8dc5cdcd9f72ad9abb693112200599c952e58e (root) VirtualMap state / report-garden-medal-layer | |||||||||
| node2 | 3m 27.302s | 2025-11-17 21:55:01.546 | 4965 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+51+49.432009215Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 3m 27.302s | 2025-11-17 21:55:01.546 | 4966 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 400 File: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+51+49.432009215Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node2 | 3m 27.302s | 2025-11-17 21:55:01.546 | 4967 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node1 | 3m 27.309s | 2025-11-17 21:55:01.553 | 4947 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+51+49.359203665Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node1 | 3m 27.309s | 2025-11-17 21:55:01.553 | 4948 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 400 File: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+51+49.359203665Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node1 | 3m 27.309s | 2025-11-17 21:55:01.553 | 4949 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node2 | 3m 27.312s | 2025-11-17 21:55:01.556 | 4968 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node2 | 3m 27.313s | 2025-11-17 21:55:01.557 | 4969 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 427 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/427 {"round":427,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/427/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node1 | 3m 27.320s | 2025-11-17 21:55:01.564 | 4950 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node1 | 3m 27.320s | 2025-11-17 21:55:01.564 | 4951 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 427 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/427 {"round":427,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/427/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node0 | 3m 27.381s | 2025-11-17 21:55:01.625 | 5023 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for com.swirlds.demo.consistency.ConsistencyTestingToolState@59e4f0ca | |
| node0 | 3m 27.384s | 2025-11-17 21:55:01.628 | 5024 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 427 Timestamp: 2025-11-17T21:55:00.283498Z Next consensus number: 14747 Legacy running event hash: f59a22f7e47035c13187862976269b822864c76177140e2d10ec37a8d59229b0006cb609865f335284da624e499dfdba Legacy running event mnemonic: error-galaxy-improve-return Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -1348217637 Root hash: c2497afd6452d40e94ffda8b4c9620fc4016a6ae357e9f390c8b0bbcba8dc5cdcd9f72ad9abb693112200599c952e58e (root) VirtualMap state / report-garden-medal-layer | |||||||||
| node0 | 3m 27.391s | 2025-11-17 21:55:01.635 | 5025 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+51+49.530997129Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 3m 27.391s | 2025-11-17 21:55:01.635 | 5026 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 400 File: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+51+49.530997129Z_seq0_minr1_maxr501_orgn0.pces | |||||||||
| node0 | 3m 27.391s | 2025-11-17 21:55:01.635 | 5027 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node0 | 3m 27.402s | 2025-11-17 21:55:01.646 | 5028 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node0 | 3m 27.402s | 2025-11-17 21:55:01.646 | 5029 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 427 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/427 {"round":427,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/427/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node0 | 4m 26.794s | 2025-11-17 21:56:01.038 | 6749 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 565 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node2 | 4m 26.879s | 2025-11-17 21:56:01.123 | 6514 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 565 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node1 | 4m 26.886s | 2025-11-17 21:56:01.130 | 6470 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 565 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node3 | 4m 26.894s | 2025-11-17 21:56:01.138 | 6451 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 565 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node3 | 4m 27.031s | 2025-11-17 21:56:01.275 | 6454 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 565 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/565 | |
| node3 | 4m 27.032s | 2025-11-17 21:56:01.276 | 6455 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for com.swirlds.demo.consistency.ConsistencyTestingToolState@733c05e2 | |
| node2 | 4m 27.066s | 2025-11-17 21:56:01.310 | 6517 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 565 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/565 | |
| node2 | 4m 27.067s | 2025-11-17 21:56:01.311 | 6518 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for com.swirlds.demo.consistency.ConsistencyTestingToolState@51df1934 | |
| node1 | 4m 27.102s | 2025-11-17 21:56:01.346 | 6473 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 565 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/565 | |
| node1 | 4m 27.102s | 2025-11-17 21:56:01.346 | 6474 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for com.swirlds.demo.consistency.ConsistencyTestingToolState@31d610f6 | |
| node3 | 4m 27.115s | 2025-11-17 21:56:01.359 | 6486 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for com.swirlds.demo.consistency.ConsistencyTestingToolState@733c05e2 | |
| node3 | 4m 27.117s | 2025-11-17 21:56:01.361 | 6487 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 565 Timestamp: 2025-11-17T21:56:00.190124Z Next consensus number: 18063 Legacy running event hash: 6066ed30d712bbd987d8102893f46ca042676266e97f03bb6919053723ebd1ffb334d42de6ea65d8cc09bbf61b5772d3 Legacy running event mnemonic: art-foam-build-lumber Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 41338665 Root hash: 6058f15229b793d2158d86c2a9e9f8ce24b8e06d1a8eeb51573f0ee94cc4a6bd9af753caeffaecac115635d0f8713207 (root) VirtualMap state / arch-still-dad-magic | |||||||||
| node3 | 4m 27.125s | 2025-11-17 21:56:01.369 | 6488 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+51+49.340458921Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+55+33.456332427Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node3 | 4m 27.125s | 2025-11-17 21:56:01.369 | 6489 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 538 File: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+55+33.456332427Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node3 | 4m 27.125s | 2025-11-17 21:56:01.369 | 6490 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node3 | 4m 27.127s | 2025-11-17 21:56:01.371 | 6491 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node3 | 4m 27.127s | 2025-11-17 21:56:01.371 | 6492 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 565 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/565 {"round":565,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/565/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node3 | 4m 27.129s | 2025-11-17 21:56:01.373 | 6493 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1 | |
| node2 | 4m 27.156s | 2025-11-17 21:56:01.400 | 6549 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for com.swirlds.demo.consistency.ConsistencyTestingToolState@51df1934 | |
| node2 | 4m 27.158s | 2025-11-17 21:56:01.402 | 6550 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 565 Timestamp: 2025-11-17T21:56:00.190124Z Next consensus number: 18063 Legacy running event hash: 6066ed30d712bbd987d8102893f46ca042676266e97f03bb6919053723ebd1ffb334d42de6ea65d8cc09bbf61b5772d3 Legacy running event mnemonic: art-foam-build-lumber Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 41338665 Root hash: 6058f15229b793d2158d86c2a9e9f8ce24b8e06d1a8eeb51573f0ee94cc4a6bd9af753caeffaecac115635d0f8713207 (root) VirtualMap state / arch-still-dad-magic | |||||||||
| node2 | 4m 27.167s | 2025-11-17 21:56:01.411 | 6551 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+51+49.432009215Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+55+33.456355536Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node2 | 4m 27.168s | 2025-11-17 21:56:01.412 | 6552 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 538 File: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+55+33.456355536Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node2 | 4m 27.168s | 2025-11-17 21:56:01.412 | 6553 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node2 | 4m 27.169s | 2025-11-17 21:56:01.413 | 6554 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node2 | 4m 27.170s | 2025-11-17 21:56:01.414 | 6555 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 565 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/565 {"round":565,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/565/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node2 | 4m 27.171s | 2025-11-17 21:56:01.415 | 6556 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1 | |
| node1 | 4m 27.186s | 2025-11-17 21:56:01.430 | 6505 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for com.swirlds.demo.consistency.ConsistencyTestingToolState@31d610f6 | |
| node1 | 4m 27.188s | 2025-11-17 21:56:01.432 | 6506 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 565 Timestamp: 2025-11-17T21:56:00.190124Z Next consensus number: 18063 Legacy running event hash: 6066ed30d712bbd987d8102893f46ca042676266e97f03bb6919053723ebd1ffb334d42de6ea65d8cc09bbf61b5772d3 Legacy running event mnemonic: art-foam-build-lumber Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 41338665 Root hash: 6058f15229b793d2158d86c2a9e9f8ce24b8e06d1a8eeb51573f0ee94cc4a6bd9af753caeffaecac115635d0f8713207 (root) VirtualMap state / arch-still-dad-magic | |||||||||
| node1 | 4m 27.194s | 2025-11-17 21:56:01.438 | 6507 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+51+49.359203665Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+55+33.419738959Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node1 | 4m 27.195s | 2025-11-17 21:56:01.439 | 6508 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 538 File: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+55+33.419738959Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node1 | 4m 27.195s | 2025-11-17 21:56:01.439 | 6509 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node1 | 4m 27.196s | 2025-11-17 21:56:01.440 | 6510 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node1 | 4m 27.196s | 2025-11-17 21:56:01.440 | 6511 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 565 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/565 {"round":565,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/565/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node1 | 4m 27.198s | 2025-11-17 21:56:01.442 | 6512 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1 | |
| node0 | 4m 27.216s | 2025-11-17 21:56:01.460 | 6762 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 565 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/565 | |
| node0 | 4m 27.217s | 2025-11-17 21:56:01.461 | 6763 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for com.swirlds.demo.consistency.ConsistencyTestingToolState@743f7f9e | |
| node0 | 4m 27.307s | 2025-11-17 21:56:01.551 | 6797 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for com.swirlds.demo.consistency.ConsistencyTestingToolState@743f7f9e | |
| node0 | 4m 27.309s | 2025-11-17 21:56:01.553 | 6808 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 565 Timestamp: 2025-11-17T21:56:00.190124Z Next consensus number: 18063 Legacy running event hash: 6066ed30d712bbd987d8102893f46ca042676266e97f03bb6919053723ebd1ffb334d42de6ea65d8cc09bbf61b5772d3 Legacy running event mnemonic: art-foam-build-lumber Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 41338665 Root hash: 6058f15229b793d2158d86c2a9e9f8ce24b8e06d1a8eeb51573f0ee94cc4a6bd9af753caeffaecac115635d0f8713207 (root) VirtualMap state / arch-still-dad-magic | |||||||||
| node0 | 4m 27.316s | 2025-11-17 21:56:01.560 | 6809 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+51+49.530997129Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+55+33.307680567Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node0 | 4m 27.317s | 2025-11-17 21:56:01.561 | 6810 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 538 File: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+55+33.307680567Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node0 | 4m 27.317s | 2025-11-17 21:56:01.561 | 6811 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node0 | 4m 27.318s | 2025-11-17 21:56:01.562 | 6812 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node0 | 4m 27.318s | 2025-11-17 21:56:01.562 | 6813 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 565 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/565 {"round":565,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/565/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node0 | 4m 27.320s | 2025-11-17 21:56:01.564 | 6814 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1 | |
| node0 | 5m 26.704s | 2025-11-17 21:57:00.948 | 8560 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 703 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node1 | 5m 26.746s | 2025-11-17 21:57:00.990 | 8043 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 703 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node3 | 5m 26.764s | 2025-11-17 21:57:01.008 | 8096 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 703 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node2 | 5m 26.770s | 2025-11-17 21:57:01.014 | 8087 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 703 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node2 | 5m 27.067s | 2025-11-17 21:57:01.311 | 8090 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 703 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/703 | |
| node2 | 5m 27.068s | 2025-11-17 21:57:01.312 | 8091 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for com.swirlds.demo.consistency.ConsistencyTestingToolState@58185a58 | |
| node3 | 5m 27.106s | 2025-11-17 21:57:01.350 | 8109 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 703 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/703 | |
| node0 | 5m 27.107s | 2025-11-17 21:57:01.351 | 8573 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 703 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/703 | |
| node3 | 5m 27.107s | 2025-11-17 21:57:01.351 | 8110 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for com.swirlds.demo.consistency.ConsistencyTestingToolState@60f00499 | |
| node0 | 5m 27.108s | 2025-11-17 21:57:01.352 | 8574 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for com.swirlds.demo.consistency.ConsistencyTestingToolState@7d016746 | |
| node1 | 5m 27.111s | 2025-11-17 21:57:01.355 | 8056 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 703 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/703 | |
| node1 | 5m 27.112s | 2025-11-17 21:57:01.356 | 8057 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for com.swirlds.demo.consistency.ConsistencyTestingToolState@3b91f38e | |
| node2 | 5m 27.149s | 2025-11-17 21:57:01.393 | 8122 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for com.swirlds.demo.consistency.ConsistencyTestingToolState@58185a58 | |
| node2 | 5m 27.151s | 2025-11-17 21:57:01.395 | 8123 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 703 Timestamp: 2025-11-17T21:57:00.011522Z Next consensus number: 21370 Legacy running event hash: 1c8f4285e1c718e16cc3e1cfd6d213924a5a8b4da4902b0fbd0e7e4f2c5ffd99abfa455cc337ddbe3ed1efd175c964c6 Legacy running event mnemonic: music-expose-shrimp-bring Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 109271563 Root hash: 1bc22f6479a085a3d3612caa6b0ebb786caea237000867406108d376c204122d9c0403af9f778ff5cd3a5eb6240a12f9 (root) VirtualMap state / joke-uncle-cross-entry | |||||||||
| node2 | 5m 27.159s | 2025-11-17 21:57:01.403 | 8124 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+51+49.432009215Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+55+33.456355536Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node2 | 5m 27.159s | 2025-11-17 21:57:01.403 | 8125 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 676 File: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+55+33.456355536Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node2 | 5m 27.159s | 2025-11-17 21:57:01.403 | 8126 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node2 | 5m 27.163s | 2025-11-17 21:57:01.407 | 8127 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node2 | 5m 27.163s | 2025-11-17 21:57:01.407 | 8128 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 703 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/703 {"round":703,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/703/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node2 | 5m 27.164s | 2025-11-17 21:57:01.408 | 8129 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/21 | |
| node3 | 5m 27.184s | 2025-11-17 21:57:01.428 | 8141 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for com.swirlds.demo.consistency.ConsistencyTestingToolState@60f00499 | |
| node3 | 5m 27.186s | 2025-11-17 21:57:01.430 | 8142 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 703 Timestamp: 2025-11-17T21:57:00.011522Z Next consensus number: 21370 Legacy running event hash: 1c8f4285e1c718e16cc3e1cfd6d213924a5a8b4da4902b0fbd0e7e4f2c5ffd99abfa455cc337ddbe3ed1efd175c964c6 Legacy running event mnemonic: music-expose-shrimp-bring Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 109271563 Root hash: 1bc22f6479a085a3d3612caa6b0ebb786caea237000867406108d376c204122d9c0403af9f778ff5cd3a5eb6240a12f9 (root) VirtualMap state / joke-uncle-cross-entry | |||||||||
| node3 | 5m 27.192s | 2025-11-17 21:57:01.436 | 8143 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+51+49.340458921Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+55+33.456332427Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node3 | 5m 27.192s | 2025-11-17 21:57:01.436 | 8144 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 676 File: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+55+33.456332427Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node0 | 5m 27.193s | 2025-11-17 21:57:01.437 | 8612 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for com.swirlds.demo.consistency.ConsistencyTestingToolState@7d016746 | |
| node3 | 5m 27.193s | 2025-11-17 21:57:01.437 | 8145 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node1 | 5m 27.194s | 2025-11-17 21:57:01.438 | 8091 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for com.swirlds.demo.consistency.ConsistencyTestingToolState@3b91f38e | |
| node0 | 5m 27.195s | 2025-11-17 21:57:01.439 | 8613 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 703 Timestamp: 2025-11-17T21:57:00.011522Z Next consensus number: 21370 Legacy running event hash: 1c8f4285e1c718e16cc3e1cfd6d213924a5a8b4da4902b0fbd0e7e4f2c5ffd99abfa455cc337ddbe3ed1efd175c964c6 Legacy running event mnemonic: music-expose-shrimp-bring Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 109271563 Root hash: 1bc22f6479a085a3d3612caa6b0ebb786caea237000867406108d376c204122d9c0403af9f778ff5cd3a5eb6240a12f9 (root) VirtualMap state / joke-uncle-cross-entry | |||||||||
| node1 | 5m 27.196s | 2025-11-17 21:57:01.440 | 8092 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 703 Timestamp: 2025-11-17T21:57:00.011522Z Next consensus number: 21370 Legacy running event hash: 1c8f4285e1c718e16cc3e1cfd6d213924a5a8b4da4902b0fbd0e7e4f2c5ffd99abfa455cc337ddbe3ed1efd175c964c6 Legacy running event mnemonic: music-expose-shrimp-bring Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 109271563 Root hash: 1bc22f6479a085a3d3612caa6b0ebb786caea237000867406108d376c204122d9c0403af9f778ff5cd3a5eb6240a12f9 (root) VirtualMap state / joke-uncle-cross-entry | |||||||||
| node3 | 5m 27.196s | 2025-11-17 21:57:01.440 | 8146 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node3 | 5m 27.197s | 2025-11-17 21:57:01.441 | 8147 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 703 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/703 {"round":703,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/703/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node3 | 5m 27.198s | 2025-11-17 21:57:01.442 | 8148 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/21 | |
| node0 | 5m 27.203s | 2025-11-17 21:57:01.447 | 8614 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+51+49.530997129Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+55+33.307680567Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node0 | 5m 27.203s | 2025-11-17 21:57:01.447 | 8615 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 676 File: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+55+33.307680567Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node0 | 5m 27.203s | 2025-11-17 21:57:01.447 | 8616 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node1 | 5m 27.204s | 2025-11-17 21:57:01.448 | 8093 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+51+49.359203665Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+55+33.419738959Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node1 | 5m 27.204s | 2025-11-17 21:57:01.448 | 8094 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 676 File: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+55+33.419738959Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node1 | 5m 27.204s | 2025-11-17 21:57:01.448 | 8095 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node0 | 5m 27.207s | 2025-11-17 21:57:01.451 | 8617 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node0 | 5m 27.208s | 2025-11-17 21:57:01.452 | 8618 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 703 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/703 {"round":703,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/703/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node1 | 5m 27.208s | 2025-11-17 21:57:01.452 | 8096 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node1 | 5m 27.208s | 2025-11-17 21:57:01.452 | 8097 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 703 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/703 {"round":703,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/703/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node0 | 5m 27.209s | 2025-11-17 21:57:01.453 | 8619 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/21 | |
| node1 | 5m 27.210s | 2025-11-17 21:57:01.454 | 8098 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/21 | |
| node4 | 5m 52.590s | 2025-11-17 21:57:26.834 | 1 | INFO | STARTUP | <main> | StaticPlatformBuilder: | ||
| ////////////////////// // Node is Starting // ////////////////////// | |||||||||
| node4 | 5m 52.675s | 2025-11-17 21:57:26.919 | 2 | DEBUG | STARTUP | <main> | StaticPlatformBuilder: | main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload] | |
| node4 | 5m 52.690s | 2025-11-17 21:57:26.934 | 3 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node4 | 5m 52.795s | 2025-11-17 21:57:27.039 | 4 | INFO | STARTUP | <main> | Browser: | The following nodes [4] are set to run locally | |
| node4 | 5m 52.824s | 2025-11-17 21:57:27.068 | 5 | DEBUG | STARTUP | <main> | BootstrapUtils: | Scanning the classpath for RuntimeConstructable classes | |
| node4 | 5m 54.070s | 2025-11-17 21:57:28.314 | 6 | DEBUG | STARTUP | <main> | BootstrapUtils: | Done with registerConstructables, time taken 1245ms | |
| node4 | 5m 54.079s | 2025-11-17 21:57:28.323 | 7 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | constructor called in Main. | |
| node4 | 5m 54.082s | 2025-11-17 21:57:28.326 | 8 | WARN | STARTUP | <main> | PlatformConfigUtils: | Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name. | |
| node4 | 5m 54.126s | 2025-11-17 21:57:28.370 | 9 | INFO | STARTUP | <main> | PrometheusEndpoint: | PrometheusEndpoint: Starting server listing on port: 9999 | |
| node4 | 5m 54.192s | 2025-11-17 21:57:28.436 | 10 | WARN | STARTUP | <main> | CryptoStatic: | There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB. | |
| node4 | 5m 54.192s | 2025-11-17 21:57:28.436 | 11 | DEBUG | STARTUP | <main> | CryptoStatic: | Started generating keys | |
| node4 | 5m 54.994s | 2025-11-17 21:57:29.238 | 12 | DEBUG | STARTUP | <main> | CryptoStatic: | Done generating keys | |
| node4 | 5m 55.084s | 2025-11-17 21:57:29.328 | 15 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node4 | 5m 55.091s | 2025-11-17 21:57:29.335 | 16 | INFO | STARTUP | <main> | StartupStateUtils: | The following saved states were found on disk: | |
| - /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/289 - /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/155 - /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/21 - /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1 | |||||||||
| node4 | 5m 55.091s | 2025-11-17 21:57:29.335 | 17 | INFO | STARTUP | <main> | StartupStateUtils: | Loading latest state from disk. | |
| node4 | 5m 55.091s | 2025-11-17 21:57:29.335 | 18 | INFO | STARTUP | <main> | StartupStateUtils: | Loading signed state from disk: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/289 | |
| node4 | 5m 55.100s | 2025-11-17 21:57:29.344 | 19 | INFO | STATE_TO_DISK | <main> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp | |
| node4 | 5m 55.213s | 2025-11-17 21:57:29.457 | 29 | INFO | STARTUP | <main> | ConsistencyTestingToolState: | New State Constructed. | |
| node4 | 5m 55.946s | 2025-11-17 21:57:30.190 | 31 | INFO | STARTUP | <main> | StartupStateUtils: | Loaded state's hash is the same as when it was saved. | |
| node4 | 5m 55.951s | 2025-11-17 21:57:30.195 | 32 | INFO | STARTUP | <main> | StartupStateUtils: | Platform has loaded a saved state {"round":289,"consensusTimestamp":"2025-11-17T21:54:00.431712425Z"} [com.swirlds.logging.legacy.payload.SavedStateLoadedPayload] | |
| node4 | 5m 55.956s | 2025-11-17 21:57:30.200 | 35 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node4 | 5m 55.957s | 2025-11-17 21:57:30.201 | 38 | INFO | STARTUP | <main> | BootstrapUtils: | Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]. | |
| node4 | 5m 55.962s | 2025-11-17 21:57:30.206 | 39 | INFO | STARTUP | <main> | AddressBookInitializer: | Using the loaded state's address book and weight values. | |
| node4 | 5m 55.971s | 2025-11-17 21:57:30.215 | 40 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node4 | 5m 55.974s | 2025-11-17 21:57:30.218 | 41 | INFO | STARTUP | <main> | ConsistencyTestingToolMain: | returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=] | |
| node4 | 5m 57.068s | 2025-11-17 21:57:31.312 | 42 | INFO | STARTUP | <main> | OSHealthChecker: | ||
| PASSED - Clock Source Speed Check Report[callsPerSec=26390175] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=254190, randomLong=4298556713359435938, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=8920, randomLong=-1809017470899760166, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=1004720, data=35, exception=null] OS Health Check Report - Complete (took 1021 ms) | |||||||||
| node4 | 5m 57.099s | 2025-11-17 21:57:31.343 | 43 | DEBUG | STARTUP | <main> | BootstrapUtils: | jvmPauseDetectorThread started | |
| node4 | 5m 57.223s | 2025-11-17 21:57:31.467 | 44 | INFO | STARTUP | <main> | PcesUtilities: | Span compaction completed for data/saved/preconsensus-events/4/2025/11/17/2025-11-17T21+51+49.234736072Z_seq0_minr1_maxr501_orgn0.pces, new upper bound is 396 | |
| node4 | 5m 57.225s | 2025-11-17 21:57:31.469 | 45 | INFO | STARTUP | <main> | StandardScratchpad: | Scratchpad platform.iss contents: | |
| LAST_ISS_ROUND null | |||||||||
| node4 | 5m 57.227s | 2025-11-17 21:57:31.471 | 46 | INFO | STARTUP | <main> | PlatformBuilder: | Default platform pool parallelism: 8 | |
| node4 | 5m 57.304s | 2025-11-17 21:57:31.548 | 47 | INFO | STARTUP | <main> | SwirldsPlatform: | Starting with roster history: | |
| RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIdUmpLKzyXgUwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBALXCoDQ+HOVsEDTZpFuJITSaGwaKX2is5K1P/lV+G+ll6u36IdqKNnZIirJrpX2N0Ad6NeF/oFcMhietrKt818PDA9Tbb2tqcHNKTxxZAEj7amQTsrU4EsNmUhaPgMs89yj9WLxCXVzW05cQjqYEA/hymzohWs1BdU3Y2KdmELe0v5fzRgDpNgYHhUN7IrlrlgXEWpuKRskBYc4PIvyACijY0/zkeEAyHOshYYGKhQbNm/NGWhFq83ro77CZZhX3Vl7hRnHLaEoCEE8atY8R1Txhy8aObhiS6R8ZVRTkZLar/FG/xe78RQfwHHD1al2w5oHR7xgTZylhbD+nVQ09Zmi25USpvqwumbMBE0OWhV+VH1WLCHfLQs6/5yuDjeZ/0D9tpQ8pfkiEkGLedzUzQkq+4/HmN4IFTOhgJHlu1tVUqohZIPZ5zSzqkqFzFQGRo2uAX8C2EJ3qgQMAEOpH8iOjiSKsezlIPuwvmrVDPxVfpY2Cq60oxRu6B8bZdbQkfwIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQAloxwiVu7pBhkO4fLqYRw4FC0VEx+c47W4xnrq3G/uXMGwE2Mfwple9FZnfT9JgSoT1UVw+cigo4720WdrPqkK8qnA3/PzGXlfJ3k6eFcBuli/KY1TakIJUAxFt5biNKatheMwAKsbF/JyVyaqG2dbSaXQ6hZBLQTYmLrmFWMvi9QdM1S8vNVMjn0hE2qQJtnVRuVwqRaAQ225jDv2CUCT28t0EWE6ccbiRi74l8KoW1Lo3v2EQ6ZZ89Xt3CwFSQHa6YVT685ECy82qMysU+YHBe9WmwJW05UAAY7JRsOo+RuuU/r4acNLmzprG+l7qsqqPkwXTcziw9Y2OYsFgY4bTlIOV0JC0AYApctDB3gbn83LM73CWccGrXq0liSV0wL11wscH3gFohXrwb646+6hgncZiDshlZlWaFSkHQJAxTR9bsbsCwKdZpzIIVOVTOT/3oLQKCCQvPriTpJiNa0P6gB0pq64lNcyG9fL8vS3YFFnWJTZwb8ZzGK+LZ91/2Y=", "gossipEndpoint": [{ "ipAddressV4": "InvOnA==", "port": 30124 }, { "ipAddressV4": "CoAAHQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAJguXwyGFpb8MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTIwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTIwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDYXoYHBtw8adD5sxLZSnlG9XgBLWVbIDl3YA4rZZ11cgl6FG2TvF8UVNXQ177cRm1xUUJRI5ulSgDofnm7Iuf6c/GoQrud2nP1yMWewGslwiEi1h2pxbN7doFvn/92Y0lJVwSV/vOpbIyPRoMeF0jXd7TEI7dYj4S7gV9uWmQCIWjwTZqVsjIAtzEkYnmS0/m5XuD9MJsin8OQRu/PEFL8qaVPQJ2GhOhpUJqvADQ/Lsq/FHcPjylcRcnUQlFRojk2jqugtoRegByjPrAOSYGJeWUCVYmd7W51L/AkVx1rDLeHj0zLTTzQRF5G56i+S+tAcpY/uiCrwLvszFlDlD1diOuaucmu54lalrSTlVe5eOyq2ga2tKi11LQ+w09105zLyRWk7DBU93f5dTYNSmokI7b4sVRxu6SP0p/F9wND77wv2Ax5OpIWWty8zy8Y+xOuRyFu/rJ4ddDmRYvRmptM0rCAfv6hgd3m5Y/OAadQm/OuN91Uq9PIJdlMtjDbIfECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEANutmL3V1PlvlsZ6xG8Sx9cKTok3kf3rBf7D7eE8Nn8ryHi3cw9CvCaj1E6zmTTh9k23DAZVWulhjTY5GWcx5NO7QAWjKau44g/HecNNrWsD/+nIrhmAk2WxKp175CwqJaIWA7CM6VMfFktjaflUPcB6RJnHrAa8M1HUpEsBz0mFmLz7lIaDemxYCE8M8slb6wTMjpL83GB+ejudRe7YK2ZWixM+CGp0ARkV+EecHaCXgEoROUNwP6mZVJcgSVR1QBQwcGAMIrutsKENM8HR9o3LWacigoJXf+IX8c6aJhrHfFvm62q+hi3baj7iR6gebEdWPtmEXgoVWOk230fLGyPU1oBxaDdYa8V4+ZFv03O91By9tuFrwZOcLCb4CPRyr8A47lHNjRIeo2nUF/c+SjV0eBcPKCnn1nW/AQWCxJ0QzzG6tEeMAGdDrE2ujPlB+Y9Sn8vB0zjYQHTr1NKyyXNogB4y48jofLDLDGOQYI6uP2fDgZeiq4dV8w91WbPHV", "gossipEndpoint": [{ "ipAddressV4": "Ih3SfA==", "port": 30125 }, { "ipAddressV4": "CoAAIQ==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJwswl59m488MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDAqlNMpfduuW0ETQVjdKf5ZBe3Ug/ybRMoCWIlue8UoxFzamAtoeFEW3GVi862iImRVyHbkBZzDQUw4ABwMdxfzTL9voozkMaOZb4KQ9yZ9zNLAAmSSuE6RFmSJnBtfufxFXqiu6esbcvyropjZLc65F2uoMCpKN0CHFpWEb2GZAaipp7WCOon0NllDLqkjPylluXO4mjbzzMSDPbBWRD8VjjkxZeszWSXYxz9hqcRYX01CGg+jhooCQ6j2yB8sfFAffIeTG6GSV1uCFa4san2emhQWpr+cHaVYJMtejL43HaEVQnF3vh5Z10T/7co63C63aay2hs6Bx5SschosyYiafI7GtbQ4qpOgjEDFT1jlydK21gy6MV3SFEYwcUfxvxxRj6pS7xiMFn4FYnBKPJWkaDkwTqboEshxstvASQOW993uEwzh4EjctRHSjSuTU6S9OsWi5I5cRF+xK6GaWsTp0KyO8uVpuM9kZfpOcor294quyKJ9nylNyIt/m8Q8/ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAqPLB/xr0Yv1l9w/RO+bqFtl8TkxF/6jOqoEUXY06dEInopLYpmkksZZ9G8vebt6hAoLjaxNMdRqCkzKgy4jn7/SQZNV9FMbZ7ckiDxsBxYZ2ZaBootuWzzVD6hCSO3Tg6JgkIzldtFtNcDVBRgZnHg+Rl6hn+gFV5S2OTTTPHWK7GHwgHXLhK7N0RL4YVrRCi/HTUZnuYCjBwvdDte5iqytY05cAO4p72P6YtDaOdAfL/IIKd1ylCWITDqTp/JDBz1uxjQmsXLVD/KEEtlvYlGjIr+wUUqIUPhFvB6ajl2NO0D/r+t1BH454zbodU92QnOJpXpoNuOv7jjALHCqo70mCSwTNUSZuVP6/KLmQe8sSzYs7O/c25FzHKBYy+aZujoa/X7aI6XVmsUkj6ae9MSvQurk0jMNg/Jy5EtWOMy7WEuyadrAv6KSP3oIfmL9jWoPcyOMfvjRHxGqOfZuFZatAwswY6O0E3ATTrN03t/BVqNHIYIXc6UOiUTo2Nx56", "gossipEndpoint": [{ "ipAddressV4": "iHQMHg==", "port": 30126 }, { "ipAddressV4": "CoAAHg==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAOxH0o7YkAUoMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDf6+SJl+puqRNd5r2Tb802jQTqPm7k3NXIeU8NQ3Hy9p0G+9p4Hgnt3ftipar7lKPKnp4PFrOP7E7XSKpafxK2OVQ0jTMvc6Yjqt+9mzyNSI1I8cSHTmhJ7kMBt0+NwVM8QN+fbKcbQaoNiPwMcckVtGeMad4aZM6hRyxzI0H3wgMj4JiM9VRwx7JbEo3R7akRwLwGr9ZQm2EQwqiyReNkBnXrsyP4KPPVAoeMfGchoAuBbV+r6v1OeYddocYmZkrsvMXUKF/uEcgd8gTu+pv3jObwIEVqXo1yC6ZlCFqO7LIvT8jTAAljkszoo67ykXTbKS0PZeLDg6nvdPvBMQ50yjfswR88S6N8VU6pud7Y+VbMYUiGzlrFi4MB9dikAjEj4PEetQyZdn84ZXGxerXlU/vTO2Fp4i1ec5rmX1P0WYMlbNELE408j5nfCfzD/qdcF5HZAiUVTYU/SWpzWcn34++KGpuqZZQdsGwCLQWeMeA/OEemYChis4cO94aOzrECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAlj5YIsbYXk2JGP9kRCBLDgz27ymYi1KDbO8g18V4T0zj2Zl7858U7mF9UBSSW+Cjl1UtUdvqFWZhh8jRoO3Jov1QGTULHRfyyPElD4VpwFribiu4GYJaodYy6NE50WwSJf32gLG0jHQWt7q+cOrn6WaG2h8O1sIxbTlnu1kqKQUQtu4oX8u23b5m9QXVJfJVdecwD5Rmab2d3dq/NNv2iNELH0myqtcoqw26xwIvXwaS4Gqi+Y0cOfjWL5Gv5AHIwvBXGIh3KUU7pbyBzqjkigbzSeoZw0C8G2cRTl0+QTuet2SVYlFh5J9/FBLvIfMfIpguglaU6xTVoRpo7RF24qQKFt2IlBROpqcwl0FyfE+2c19FGt1V8E5dYqE4T2mHT6FSOI3DckA2afBm1OCeMNtkqCQT8x+JvdKrgUh44QDm4PIVZDzaxog/zOzRWPCgpCPq0HcNMzgCVFt+4q8eTL9Ju/rQcS9bDosjMA69NGLIOCdPW2i/gkS9x9rTXgyp", "gossipEndpoint": [{ "ipAddressV4": "IkQRmQ==", "port": 30127 }, { "ipAddressV4": "CoAAHw==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIXlngkVEv6iMwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAL4o3FK8th1cG+FSlw4iT9FlkwK+hOj4Ay6Z70mZlsNwszgxvddUEO4BEdA1iSWfxkYOLl4QwwPr3l394a07VfB5OK3dqJ6CjVdByyvzghtk3gOpkskWlJxp6vah7BbIJFWE8off7fhCdwAGSrwIRdGE8u8GbKJIdHk6/XyjB3j0BXTIgeaPTJxLeuz/2l/dQVRMXyZNxlc5UVQYnX9haMRk7M5bkb9uwfYPRikEJFp6G72x7M7Q9lBGJ3ArCQn/lPJfHSg01GxfDhWH8DOwLaFdv1bCs2zHTn7R7Wq9ymXvkUsZhlYO4mLR8HKDcM3sCrJa2rg8vgnIoZupHABKxkgtT2wxV7fM5f2oiz0mDYDTRJpgmK1lmNANj2tKnGqeDnsW7Q3zwufgZZhbks8+8uigyOyKNbp6D7Vv5KeYRibjr/xh+yWT0v02dtpBIdhqDa5CUVD9fCwigZj3PQc8N4e47ZL6s1pXpQ6Cf0lB0fSsvyhnGRa8HMx2q5eg5j/lCQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQCr9yUzOoi0xhoDE1mqR3FR/iVCq9PaBUURWL743LDMrlEvpzKX0upcwwwdgJFjVqVUywh6rKeHQt4O4UV6FIbpp0PSjSE7XZSK3UNqnhZJhQ3aNrOP+6wBhm2B0ZjrxyMS1EWeD9tcNkdYluO00RlieAEV4zwoAfeFPSB21iXW5dU8idhNuTLptDc7SJoErxN+44jvcrSe/ZhpQohG6WfyDPH0BE1tyzsiD29PAWKkrfhg5kzjTAP/qFp+ByazeltP9/F0NXI5AHbE0pKYr56XUlwDfDZOTU9b1YeS7kKyPvccvC2j9NjGGM7NjafdFLHUTYBZiNUTZXVstddYtTCVbTqI7I/x6hoeeNVDZv7XluwZLrYsDNsNrWU3c9VijPK1CE5Owy+gJoGgxEHfA/n9Jvc3lEesqKBpW92RazkpHW2eD9wh8Ayv3q6PNDGzWyiXA8YWW6yD/dIp2Oh8szZUfOXy8sQ8VW86T6RsqGP5CKKPGW1NnP/KTKe5/WoBLZQ=", "gossipEndpoint": [{ "ipAddressV4": "Ike+6w==", "port": 30128 }, { "ipAddressV4": "CoAAIA==", "port": 30128 }] }] } | |||||||||
| node4 | 5m 57.326s | 2025-11-17 21:57:31.570 | 48 | INFO | STARTUP | <main> | ConsistencyTestingToolState: | State initialized with state long 1960681300599479229. | |
| node4 | 5m 57.326s | 2025-11-17 21:57:31.570 | 49 | INFO | STARTUP | <main> | ConsistencyTestingToolState: | State initialized with 289 rounds handled. | |
| node4 | 5m 57.327s | 2025-11-17 21:57:31.571 | 50 | INFO | STARTUP | <main> | TransactionHandlingHistory: | Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv | |
| node4 | 5m 57.327s | 2025-11-17 21:57:31.571 | 51 | INFO | STARTUP | <main> | TransactionHandlingHistory: | Log file found. Parsing previous history | |
| node4 | 5m 57.364s | 2025-11-17 21:57:31.608 | 52 | INFO | STARTUP | <main> | StateInitializer: | The platform is using the following initial state: | |
| Round: 289 Timestamp: 2025-11-17T21:54:00.431712425Z Next consensus number: 10264 Legacy running event hash: 5830c815b8bc7a1a17b6c3399345755c7a82ba9cd852062f27d64f0813ffefeb3451a5133de5b82a8810b35875d9a942 Legacy running event mnemonic: alcohol-word-trigger-drive Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -445431835 Root hash: 7d01d117598d32e09001027345fa0af7a7c81a9e6ea46435bf6df5eb58114fa31188f2837dbba7f78d4c7a0c160c4d98 (root) VirtualMap state / parent-tag-coyote-foster | |||||||||
| node4 | 5m 57.369s | 2025-11-17 21:57:31.613 | 54 | INFO | RECONNECT | <<platform-core: reconnectController>> | ReconnectController: | Starting the ReconnectController | |
| node4 | 5m 57.554s | 2025-11-17 21:57:31.798 | 55 | INFO | EVENT_STREAM | <main> | DefaultConsensusEventStream: | EventStreamManager::updateRunningHash: 5830c815b8bc7a1a17b6c3399345755c7a82ba9cd852062f27d64f0813ffefeb3451a5133de5b82a8810b35875d9a942 | |
| node4 | 5m 57.562s | 2025-11-17 21:57:31.806 | 56 | INFO | STARTUP | <platformForkJoinThread-5> | Shadowgraph: | Shadowgraph starting from expiration threshold 262 | |
| node4 | 5m 57.567s | 2025-11-17 21:57:31.811 | 58 | INFO | STARTUP | <<start-node-4>> | ConsistencyTestingToolMain: | init called in Main for node 4. | |
| node4 | 5m 57.567s | 2025-11-17 21:57:31.811 | 59 | INFO | STARTUP | <<start-node-4>> | SwirldsPlatform: | Starting platform 4 | |
| node4 | 5m 57.568s | 2025-11-17 21:57:31.812 | 60 | INFO | STARTUP | <<platform: recycle-bin-cleanup>> | RecycleBinImpl: | Deleted 0 files from the recycle bin. | |
| node4 | 5m 57.571s | 2025-11-17 21:57:31.815 | 61 | INFO | STARTUP | <<start-node-4>> | CycleFinder: | No cyclical back pressure detected in wiring model. | |
| node4 | 5m 57.573s | 2025-11-17 21:57:31.817 | 62 | INFO | STARTUP | <<start-node-4>> | DirectSchedulerChecks: | No illegal direct scheduler use detected in the wiring model. | |
| node4 | 5m 57.573s | 2025-11-17 21:57:31.817 | 63 | INFO | STARTUP | <<start-node-4>> | InputWireChecks: | All input wires have been bound. | |
| node4 | 5m 57.575s | 2025-11-17 21:57:31.819 | 64 | INFO | STARTUP | <<start-node-4>> | SwirldsPlatform: | replaying preconsensus event stream starting at 262 | |
| node4 | 5m 57.581s | 2025-11-17 21:57:31.825 | 65 | INFO | PLATFORM_STATUS | <platformForkJoinThread-5> | StatusStateMachine: | Platform spent 159.0 ms in STARTING_UP. Now in REPLAYING_EVENTS | |
| node4 | 5m 57.821s | 2025-11-17 21:57:32.065 | 66 | INFO | STARTUP | <<scheduler ConsensusEngine>> | ConsensusImpl: | Found init judge (CR:4 H:586017b0ed57 BR:284), num remaining: 4 | |
| node4 | 5m 57.823s | 2025-11-17 21:57:32.067 | 67 | INFO | STARTUP | <<scheduler ConsensusEngine>> | ConsensusImpl: | Found init judge (CR:0 H:e45e78579297 BR:284), num remaining: 3 | |
| node4 | 5m 57.824s | 2025-11-17 21:57:32.068 | 68 | INFO | STARTUP | <<scheduler ConsensusEngine>> | ConsensusImpl: | Found init judge (CR:3 H:005d8bab619f BR:284), num remaining: 2 | |
| node4 | 5m 57.824s | 2025-11-17 21:57:32.068 | 69 | INFO | STARTUP | <<scheduler ConsensusEngine>> | ConsensusImpl: | Found init judge (CR:2 H:bf52076e10f1 BR:284), num remaining: 1 | |
| node4 | 5m 57.825s | 2025-11-17 21:57:32.069 | 70 | INFO | STARTUP | <<scheduler ConsensusEngine>> | ConsensusImpl: | Found init judge (CR:1 H:a87f4ed2ca9e BR:284), num remaining: 0 | |
| node4 | 5m 58.577s | 2025-11-17 21:57:32.821 | 1031 | INFO | STARTUP | <<start-node-4>> | PcesReplayer: | Replayed 4,651 preconsensus events with max birth round 396. These events contained 6,498 transactions. 106 rounds reached consensus spanning 46.0 seconds of consensus time. The latest round to reach consensus is round 395. Replay took 1.0 second. | |
| node4 | 5m 58.581s | 2025-11-17 21:57:32.825 | 1032 | INFO | STARTUP | <<app: appMain 4>> | ConsistencyTestingToolMain: | run called in Main. | |
| node4 | 5m 58.582s | 2025-11-17 21:57:32.826 | 1033 | INFO | PLATFORM_STATUS | <platformForkJoinThread-8> | StatusStateMachine: | Platform spent 999.0 ms in REPLAYING_EVENTS. Now in OBSERVING | |
| node4 | 5m 59.447s | 2025-11-17 21:57:33.691 | 1054 | INFO | RECONNECT | <<platform-core: reconnectController>> | ReconnectController: | Preparing for reconnect, stopping gossip | |
| node4 | 5m 59.447s | 2025-11-17 21:57:33.691 | 1055 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | RpcPeerHandler: | SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=395,newEventBirthRound=396,ancientThreshold=367,expiredThreshold=294] remote ev=EventWindow[latestConsensusRound=778,newEventBirthRound=779,ancientThreshold=751,expiredThreshold=677] | |
| node4 | 5m 59.447s | 2025-11-17 21:57:33.691 | 1058 | INFO | RECONNECT | <<platform-core: SyncProtocolWith2 4 to 2>> | RpcPeerHandler: | SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=395,newEventBirthRound=396,ancientThreshold=367,expiredThreshold=294] remote ev=EventWindow[latestConsensusRound=778,newEventBirthRound=779,ancientThreshold=751,expiredThreshold=677] | |
| node4 | 5m 59.447s | 2025-11-17 21:57:33.691 | 1057 | INFO | RECONNECT | <<platform-core: SyncProtocolWith0 4 to 0>> | RpcPeerHandler: | SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=395,newEventBirthRound=396,ancientThreshold=367,expiredThreshold=294] remote ev=EventWindow[latestConsensusRound=778,newEventBirthRound=779,ancientThreshold=751,expiredThreshold=677] | |
| node4 | 5m 59.447s | 2025-11-17 21:57:33.691 | 1056 | INFO | RECONNECT | <<platform-core: SyncProtocolWith3 4 to 3>> | RpcPeerHandler: | SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=395,newEventBirthRound=396,ancientThreshold=367,expiredThreshold=294] remote ev=EventWindow[latestConsensusRound=778,newEventBirthRound=779,ancientThreshold=751,expiredThreshold=677] | |
| node4 | 5m 59.447s | 2025-11-17 21:57:33.691 | 1059 | INFO | RECONNECT | <<platform-core: reconnectController>> | ReconnectController: | Preparing for reconnect, start clearing queues | |
| node4 | 5m 59.447s | 2025-11-17 21:57:33.691 | 1060 | INFO | PLATFORM_STATUS | <platformForkJoinThread-2> | StatusStateMachine: | Platform spent 864.0 ms in OBSERVING. Now in BEHIND | |
| node3 | 5m 59.517s | 2025-11-17 21:57:33.761 | 9054 | INFO | RECONNECT | <<platform-core: SyncProtocolWith4 3 to 4>> | RpcPeerHandler: | OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=778,newEventBirthRound=779,ancientThreshold=751,expiredThreshold=677] remote ev=EventWindow[latestConsensusRound=395,newEventBirthRound=396,ancientThreshold=367,expiredThreshold=294] | |
| node0 | 5m 59.518s | 2025-11-17 21:57:33.762 | 9562 | INFO | RECONNECT | <<platform-core: SyncProtocolWith4 0 to 4>> | RpcPeerHandler: | OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=778,newEventBirthRound=779,ancientThreshold=751,expiredThreshold=677] remote ev=EventWindow[latestConsensusRound=395,newEventBirthRound=396,ancientThreshold=367,expiredThreshold=294] | |
| node1 | 5m 59.518s | 2025-11-17 21:57:33.762 | 9009 | INFO | RECONNECT | <<platform-core: SyncProtocolWith4 1 to 4>> | RpcPeerHandler: | OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=778,newEventBirthRound=779,ancientThreshold=751,expiredThreshold=677] remote ev=EventWindow[latestConsensusRound=395,newEventBirthRound=396,ancientThreshold=367,expiredThreshold=294] | |
| node2 | 5m 59.518s | 2025-11-17 21:57:33.762 | 8995 | INFO | RECONNECT | <<platform-core: SyncProtocolWith4 2 to 4>> | RpcPeerHandler: | OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=778,newEventBirthRound=779,ancientThreshold=751,expiredThreshold=677] remote ev=EventWindow[latestConsensusRound=395,newEventBirthRound=396,ancientThreshold=367,expiredThreshold=294] | |
| node4 | 5m 59.600s | 2025-11-17 21:57:33.844 | 1061 | INFO | RECONNECT | <<platform-core: reconnectController>> | ReconnectController: | Queues have been cleared | |
| node4 | 5m 59.600s | 2025-11-17 21:57:33.844 | 1062 | INFO | RECONNECT | <<platform-core: reconnectController>> | ReconnectController: | Waiting for a state to be obtained from a peer | |
| node1 | 5m 59.694s | 2025-11-17 21:57:33.938 | 9018 | INFO | RECONNECT | <<platform-core: SyncProtocolWith4 1 to 4>> | ReconnectStateTeacher: | Starting reconnect in the role of the sender {"receiving":false,"nodeId":1,"otherNodeId":4,"round":778} [com.swirlds.logging.legacy.payload.ReconnectStartPayload] | |
| node1 | 5m 59.695s | 2025-11-17 21:57:33.939 | 9019 | INFO | RECONNECT | <<platform-core: SyncProtocolWith4 1 to 4>> | ReconnectStateTeacher: | The following state will be sent to the learner: | |
| Round: 778 Timestamp: 2025-11-17T21:57:32.621633Z Next consensus number: 23170 Legacy running event hash: cded633e125216680724560f8317f382a4943d0a40cd5b6f2078a22272d63d74422549b1137d91ee0240b54f3e33ae00 Legacy running event mnemonic: draw-guitar-engage-snack Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 2138287084 Root hash: a73580b021f12b81f557cfa8e8efb5dd1f4fee86da9c37d6e1ae9832e08dc31241af1084164b2228b42ed06786cac5a8 (root) VirtualMap state / bonus-tube-dilemma-ice | |||||||||
| node1 | 5m 59.695s | 2025-11-17 21:57:33.939 | 9020 | INFO | RECONNECT | <<platform-core: SyncProtocolWith4 1 to 4>> | ReconnectStateTeacher: | Sending signatures from nodes 0, 1, 3 (signing weight = 37500000000/50000000000) for state hash a73580b021f12b81f557cfa8e8efb5dd1f4fee86da9c37d6e1ae9832e08dc31241af1084164b2228b42ed06786cac5a8 | |
| node1 | 5m 59.696s | 2025-11-17 21:57:33.940 | 9021 | INFO | RECONNECT | <<platform-core: SyncProtocolWith4 1 to 4>> | ReconnectStateTeacher: | Starting synchronization in the role of the sender. | |
| node4 | 5m 59.765s | 2025-11-17 21:57:34.009 | 1063 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | ReconnectStatePeerProtocol: | Starting reconnect in the role of the receiver. {"receiving":true,"nodeId":4,"otherNodeId":1,"round":395} [com.swirlds.logging.legacy.payload.ReconnectStartPayload] | |
| node4 | 5m 59.766s | 2025-11-17 21:57:34.010 | 1064 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | ReconnectStateLearner: | Receiving signed state signatures | |
| node4 | 5m 59.767s | 2025-11-17 21:57:34.011 | 1065 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | ReconnectStateLearner: | Received signatures from nodes 0, 1, 3 | |
| node1 | 5m 59.821s | 2025-11-17 21:57:34.065 | 9040 | INFO | RECONNECT | <<platform-core: SyncProtocolWith4 1 to 4>> | TeachingSynchronizer: | sending tree rooted at com.swirlds.virtualmap.VirtualMap with route [] | |
| node1 | 5m 59.836s | 2025-11-17 21:57:34.080 | 9041 | INFO | RECONNECT | <<work group teaching-synchronizer: async-input-stream #0>> | AsyncInputStream: | com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@152ec78c start run() | |
| node4 | 5m 59.972s | 2025-11-17 21:57:34.216 | 1094 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | LearningSynchronizer: | learner calls receiveTree() | |
| node4 | 5m 59.972s | 2025-11-17 21:57:34.216 | 1095 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | LearningSynchronizer: | synchronizing tree | |
| node4 | 5m 59.973s | 2025-11-17 21:57:34.217 | 1096 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | LearningSynchronizer: | receiving tree rooted at com.swirlds.virtualmap.VirtualMap with route [] | |
| node4 | 5m 59.982s | 2025-11-17 21:57:34.226 | 1097 | INFO | RECONNECT | <<work group learning-synchronizer: async-input-stream #0>> | AsyncInputStream: | com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@1a0016f1 start run() | |
| node4 | 6.001m | 2025-11-17 21:57:34.284 | 1098 | INFO | RECONNECT | <<work group learning-synchronizer: async-input-stream #0>> | ReconnectNodeRemover: | setPathInformation(): firstLeafPath: 4 -> 4, lastLeafPath: 8 -> 8 | |
| node4 | 6.001m | 2025-11-17 21:57:34.285 | 1099 | INFO | RECONNECT | <<work group learning-synchronizer: async-input-stream #0>> | ReconnectNodeRemover: | setPathInformation(): done | |
| node4 | 6.004m | 2025-11-17 21:57:34.457 | 1100 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | LearnerPushTask: | learner thread finished the learning loop for the current subtree | |
| node4 | 6.004m | 2025-11-17 21:57:34.458 | 1101 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | LearnerPushVirtualTreeView: | call nodeRemover.allNodesReceived() | |
| node4 | 6.004m | 2025-11-17 21:57:34.458 | 1102 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | ReconnectNodeRemover: | allNodesReceived() | |
| node4 | 6.004m | 2025-11-17 21:57:34.458 | 1103 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | ReconnectNodeRemover: | allNodesReceived(): done | |
| node4 | 6.004m | 2025-11-17 21:57:34.459 | 1104 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | LearnerPushVirtualTreeView: | call root.endLearnerReconnect() | |
| node4 | 6.004m | 2025-11-17 21:57:34.459 | 1105 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | VirtualMap: | call reconnectIterator.close() | |
| node4 | 6.004m | 2025-11-17 21:57:34.459 | 1106 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | VirtualMap: | call setHashPrivate() | |
| node4 | 6.004m | 2025-11-17 21:57:34.480 | 1116 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | VirtualMap: | call postInit() | |
| node4 | 6.004m | 2025-11-17 21:57:34.481 | 1118 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | VirtualMap: | endLearnerReconnect() complete | |
| node4 | 6.004m | 2025-11-17 21:57:34.481 | 1119 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | LearnerPushVirtualTreeView: | close() complete | |
| node4 | 6.004m | 2025-11-17 21:57:34.482 | 1120 | INFO | RECONNECT | <<work group learning-synchronizer: learner-task #2>> | LearnerPushTask: | learner thread closed input, output, and view for the current subtree | |
| node4 | 6.004m | 2025-11-17 21:57:34.482 | 1121 | INFO | RECONNECT | <<work group learning-synchronizer: async-input-stream #0>> | AsyncInputStream: | com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@1a0016f1 finish run() | |
| node4 | 6.004m | 2025-11-17 21:57:34.483 | 1122 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | LearningSynchronizer: | received tree rooted at com.swirlds.virtualmap.VirtualMap with route [] | |
| node4 | 6.004m | 2025-11-17 21:57:34.483 | 1123 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | LearningSynchronizer: | synchronization complete | |
| node4 | 6.004m | 2025-11-17 21:57:34.484 | 1124 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | LearningSynchronizer: | learner calls initialize() | |
| node4 | 6.004m | 2025-11-17 21:57:34.484 | 1125 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | LearningSynchronizer: | initializing tree | |
| node4 | 6.004m | 2025-11-17 21:57:34.484 | 1126 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | LearningSynchronizer: | initialization complete | |
| node4 | 6.004m | 2025-11-17 21:57:34.484 | 1127 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | LearningSynchronizer: | learner calls hash() | |
| node4 | 6.004m | 2025-11-17 21:57:34.484 | 1128 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | LearningSynchronizer: | hashing tree | |
| node4 | 6.004m | 2025-11-17 21:57:34.484 | 1129 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | LearningSynchronizer: | hashing complete | |
| node4 | 6.004m | 2025-11-17 21:57:34.485 | 1130 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | LearningSynchronizer: | learner calls logStatistics() | |
| node4 | 6.004m | 2025-11-17 21:57:34.487 | 1131 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | LearningSynchronizer: | Finished synchronization {"timeInSeconds":0.266,"hashTimeInSeconds":0.0,"initializationTimeInSeconds":0.0,"totalNodes":9,"leafNodes":5,"redundantLeafNodes":2,"internalNodes":4,"redundantInternalNodes":0} [com.swirlds.logging.legacy.payload.SynchronizationCompletePayload] | |
| node4 | 6.004m | 2025-11-17 21:57:34.488 | 1132 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | LearningSynchronizer: | ReconnectMapMetrics: transfersFromTeacher=9; transfersFromLearner=8; internalHashes=3; internalCleanHashes=0; internalData=0; internalCleanData=0; leafHashes=5; leafCleanHashes=2; leafData=5; leafCleanData=2 | |
| node4 | 6.004m | 2025-11-17 21:57:34.488 | 1133 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | LearningSynchronizer: | learner is done synchronizing | |
| node4 | 6.004m | 2025-11-17 21:57:34.489 | 1134 | INFO | STARTUP | <<platform-core: SyncProtocolWith1 4 to 1>> | ConsistencyTestingToolState: | New State Constructed. | |
| node4 | 6.004m | 2025-11-17 21:57:34.494 | 1135 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | ReconnectStateLearner: | Reconnect data usage report {"dataMegabytes":0.005864143371582031} [com.swirlds.logging.legacy.payload.ReconnectDataUsagePayload] | |
| node1 | 6.004m | 2025-11-17 21:57:34.501 | 9045 | INFO | RECONNECT | <<work group teaching-synchronizer: async-input-stream #0>> | AsyncInputStream: | com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@152ec78c finish run() | |
| node1 | 6.004m | 2025-11-17 21:57:34.504 | 9046 | INFO | RECONNECT | <<platform-core: SyncProtocolWith4 1 to 4>> | TeachingSynchronizer: | finished sending tree | |
| node1 | 6.004m | 2025-11-17 21:57:34.507 | 9049 | INFO | RECONNECT | <<platform-core: SyncProtocolWith4 1 to 4>> | ReconnectStateTeacher: | Finished synchronization in the role of the sender. | |
| node1 | 6.005m | 2025-11-17 21:57:34.567 | 9050 | INFO | RECONNECT | <<platform-core: SyncProtocolWith4 1 to 4>> | ReconnectStateTeacher: | Finished reconnect in the role of the sender. {"receiving":false,"nodeId":1,"otherNodeId":4,"round":778} [com.swirlds.logging.legacy.payload.ReconnectFinishPayload] | |
| node4 | 6.006m | 2025-11-17 21:57:34.582 | 1136 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | ReconnectStatePeerProtocol: | Finished reconnect in the role of the receiver. {"receiving":true,"nodeId":4,"otherNodeId":1,"round":778} [com.swirlds.logging.legacy.payload.ReconnectFinishPayload] | |
| node4 | 6.006m | 2025-11-17 21:57:34.583 | 1137 | INFO | RECONNECT | <<platform-core: SyncProtocolWith1 4 to 1>> | ReconnectStatePeerProtocol: | Information for state received during reconnect: | |
| Round: 778 Timestamp: 2025-11-17T21:57:32.621633Z Next consensus number: 23170 Legacy running event hash: cded633e125216680724560f8317f382a4943d0a40cd5b6f2078a22272d63d74422549b1137d91ee0240b54f3e33ae00 Legacy running event mnemonic: draw-guitar-engage-snack Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 2138287084 Root hash: a73580b021f12b81f557cfa8e8efb5dd1f4fee86da9c37d6e1ae9832e08dc31241af1084164b2228b42ed06786cac5a8 (root) VirtualMap state / bonus-tube-dilemma-ice | |||||||||
| node4 | 6.006m | 2025-11-17 21:57:34.584 | 1138 | INFO | RECONNECT | <<platform-core: reconnectController>> | ReconnectController: | A state was obtained from a peer | |
| node4 | 6.006m | 2025-11-17 21:57:34.585 | 1139 | INFO | RECONNECT | <<platform-core: reconnectController>> | ReconnectController: | The state obtained from a peer was validated | |
| node4 | 6.006m | 2025-11-17 21:57:34.586 | 1141 | DEBUG | RECONNECT | <<platform-core: reconnectController>> | ReconnectController: | `loadState`: reloading state | |
| node4 | 6.006m | 2025-11-17 21:57:34.586 | 1142 | INFO | STARTUP | <<platform-core: reconnectController>> | ConsistencyTestingToolState: | State initialized with state long 534881315071146681. | |
| node4 | 6.006m | 2025-11-17 21:57:34.587 | 1143 | INFO | STARTUP | <<platform-core: reconnectController>> | ConsistencyTestingToolState: | State initialized with 778 rounds handled. | |
| node4 | 6.006m | 2025-11-17 21:57:34.587 | 1144 | INFO | STARTUP | <<platform-core: reconnectController>> | TransactionHandlingHistory: | Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv | |
| node4 | 6.006m | 2025-11-17 21:57:34.587 | 1145 | INFO | STARTUP | <<platform-core: reconnectController>> | TransactionHandlingHistory: | Log file found. Parsing previous history | |
| node4 | 6.006m | 2025-11-17 21:57:34.611 | 1152 | INFO | STATE_TO_DISK | <<platform-core: reconnectController>> | DefaultSavedStateController: | Signed state from round 778 created, will eventually be written to disk, for reason: RECONNECT | |
| node4 | 6.006m | 2025-11-17 21:57:34.612 | 1153 | INFO | PLATFORM_STATUS | <platformForkJoinThread-4> | StatusStateMachine: | Platform spent 918.0 ms in BEHIND. Now in RECONNECT_COMPLETE | |
| node4 | 6.006m | 2025-11-17 21:57:34.612 | 1155 | INFO | STARTUP | <platformForkJoinThread-5> | Shadowgraph: | Shadowgraph starting from expiration threshold 751 | |
| node4 | 6.006m | 2025-11-17 21:57:34.614 | 1157 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 778 state to disk. Reason: RECONNECT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/778 | |
| node4 | 6.006m | 2025-11-17 21:57:34.615 | 1158 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/3 for com.swirlds.demo.consistency.ConsistencyTestingToolState@5d621735 | |
| node4 | 6.006m | 2025-11-17 21:57:34.626 | 1171 | INFO | EVENT_STREAM | <<platform-core: reconnectController>> | DefaultConsensusEventStream: | EventStreamManager::updateRunningHash: cded633e125216680724560f8317f382a4943d0a40cd5b6f2078a22272d63d74422549b1137d91ee0240b54f3e33ae00 | |
| node4 | 6.006m | 2025-11-17 21:57:34.627 | 1173 | INFO | STARTUP | <platformForkJoinThread-1> | PcesFileManager: | Due to recent operations on this node, the local preconsensus event stream will have a discontinuity. The last file with the old origin round is 2025-11-17T21+51+49.234736072Z_seq0_minr1_maxr396_orgn0.pces. All future files will have an origin round of 778. | |
| node4 | 6.006m | 2025-11-17 21:57:34.628 | 1177 | INFO | RECONNECT | <<platform-core: reconnectController>> | ReconnectController: | Reconnect almost done, resuming gossip | |
| node4 | 6.008m | 2025-11-17 21:57:34.751 | 1204 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/3 for com.swirlds.demo.consistency.ConsistencyTestingToolState@5d621735 | |
| node4 | 6.008m | 2025-11-17 21:57:34.754 | 1205 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 778 Timestamp: 2025-11-17T21:57:32.621633Z Next consensus number: 23170 Legacy running event hash: cded633e125216680724560f8317f382a4943d0a40cd5b6f2078a22272d63d74422549b1137d91ee0240b54f3e33ae00 Legacy running event mnemonic: draw-guitar-engage-snack Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 2138287084 Root hash: a73580b021f12b81f557cfa8e8efb5dd1f4fee86da9c37d6e1ae9832e08dc31241af1084164b2228b42ed06786cac5a8 (root) VirtualMap state / bonus-tube-dilemma-ice | |||||||||
| node4 | 6.009m | 2025-11-17 21:57:34.786 | 1206 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus file on disk. | |
| File: data/saved/preconsensus-events/4/2025/11/17/2025-11-17T21+51+49.234736072Z_seq0_minr1_maxr396_orgn0.pces | |||||||||
| node4 | 6.009m | 2025-11-17 21:57:34.786 | 1207 | WARN | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | No preconsensus event files meeting specified criteria found to copy. Lower bound: 751 | |
| node4 | 6.009m | 2025-11-17 21:57:34.792 | 1208 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 778 to disk. Reason: RECONNECT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/778 {"round":778,"freezeState":false,"reason":"RECONNECT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/778/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node4 | 6.009m | 2025-11-17 21:57:34.795 | 1209 | INFO | PLATFORM_STATUS | <platformForkJoinThread-8> | StatusStateMachine: | Platform spent 182.0 ms in RECONNECT_COMPLETE. Now in CHECKING | |
| node4 | 6.010m | 2025-11-17 21:57:34.822 | 1210 | INFO | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting4.csv' ] | |
| node4 | 6.010m | 2025-11-17 21:57:34.824 | 1211 | DEBUG | STARTUP | <<platform-core: MetricsThread #0>> | LegacyCsvWriter: | CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ] | |
| node4 | 6m 1.532s | 2025-11-17 21:57:35.776 | 1212 | INFO | STARTUP | <<scheduler ConsensusEngine>> | ConsensusImpl: | Found init judge (CR:0 H:0c384f996e18 BR:776), num remaining: 3 | |
| node4 | 6m 1.533s | 2025-11-17 21:57:35.777 | 1213 | INFO | STARTUP | <<scheduler ConsensusEngine>> | ConsensusImpl: | Found init judge (CR:1 H:5580496f744b BR:776), num remaining: 2 | |
| node4 | 6m 1.533s | 2025-11-17 21:57:35.777 | 1214 | INFO | STARTUP | <<scheduler ConsensusEngine>> | ConsensusImpl: | Found init judge (CR:3 H:5da20fe92184 BR:776), num remaining: 1 | |
| node4 | 6m 1.534s | 2025-11-17 21:57:35.778 | 1215 | INFO | STARTUP | <<scheduler ConsensusEngine>> | ConsensusImpl: | Found init judge (CR:2 H:c7a79f5425e3 BR:776), num remaining: 0 | |
| node4 | 6m 5.153s | 2025-11-17 21:57:39.397 | 1338 | INFO | PLATFORM_STATUS | <platformForkJoinThread-2> | StatusStateMachine: | Platform spent 4.6 s in CHECKING. Now in ACTIVE | |
| node3 | 6m 26.854s | 2025-11-17 21:58:01.098 | 9737 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 838 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node2 | 6m 26.880s | 2025-11-17 21:58:01.124 | 9662 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 838 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node1 | 6m 26.887s | 2025-11-17 21:58:01.131 | 9703 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 838 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node0 | 6m 26.894s | 2025-11-17 21:58:01.138 | 10251 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 838 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node4 | 6m 26.977s | 2025-11-17 21:58:01.221 | 1860 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 838 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node0 | 6m 27.035s | 2025-11-17 21:58:01.279 | 10254 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 838 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/838 | |
| node0 | 6m 27.036s | 2025-11-17 21:58:01.280 | 10255 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for com.swirlds.demo.consistency.ConsistencyTestingToolState@396a67d4 | |
| node4 | 6m 27.093s | 2025-11-17 21:58:01.337 | 1863 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 838 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/838 | |
| node4 | 6m 27.094s | 2025-11-17 21:58:01.338 | 1864 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/10 for com.swirlds.demo.consistency.ConsistencyTestingToolState@35175250 | |
| node0 | 6m 27.121s | 2025-11-17 21:58:01.365 | 10294 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for com.swirlds.demo.consistency.ConsistencyTestingToolState@396a67d4 | |
| node0 | 6m 27.123s | 2025-11-17 21:58:01.367 | 10295 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 838 Timestamp: 2025-11-17T21:58:00.071623263Z Next consensus number: 25222 Legacy running event hash: 20b58560604e8b03f7b23ef2a081a72232b639ab04b0fdcc66c53628c8abdef49072d853ccde6ef0d3b613fe1e32812b Legacy running event mnemonic: birth-favorite-antenna-panic Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 515487362 Root hash: ecf991c82965026809da59be9e1b372fa51f761d37d7d7eea5b9471a5e883c840d00dc7630248e524b79a1b724680c02 (root) VirtualMap state / genius-shove-coach-escape | |||||||||
| node1 | 6m 27.126s | 2025-11-17 21:58:01.370 | 9706 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 838 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/838 | |
| node1 | 6m 27.127s | 2025-11-17 21:58:01.371 | 9707 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/52 for com.swirlds.demo.consistency.ConsistencyTestingToolState@42dd6ca7 | |
| node0 | 6m 27.129s | 2025-11-17 21:58:01.373 | 10296 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+51+49.530997129Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+55+33.307680567Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node0 | 6m 27.129s | 2025-11-17 21:58:01.373 | 10297 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 811 File: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+55+33.307680567Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node0 | 6m 27.131s | 2025-11-17 21:58:01.375 | 10298 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node0 | 6m 27.138s | 2025-11-17 21:58:01.382 | 10299 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node0 | 6m 27.138s | 2025-11-17 21:58:01.382 | 10300 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 838 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/838 {"round":838,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/838/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node0 | 6m 27.140s | 2025-11-17 21:58:01.384 | 10301 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/155 | |
| node2 | 6m 27.197s | 2025-11-17 21:58:01.441 | 9665 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 838 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/838 | |
| node2 | 6m 27.197s | 2025-11-17 21:58:01.441 | 9666 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for com.swirlds.demo.consistency.ConsistencyTestingToolState@648bfa59 | |
| node1 | 6m 27.214s | 2025-11-17 21:58:01.458 | 9738 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/52 for com.swirlds.demo.consistency.ConsistencyTestingToolState@42dd6ca7 | |
| node1 | 6m 27.216s | 2025-11-17 21:58:01.460 | 9739 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 838 Timestamp: 2025-11-17T21:58:00.071623263Z Next consensus number: 25222 Legacy running event hash: 20b58560604e8b03f7b23ef2a081a72232b639ab04b0fdcc66c53628c8abdef49072d853ccde6ef0d3b613fe1e32812b Legacy running event mnemonic: birth-favorite-antenna-panic Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 515487362 Root hash: ecf991c82965026809da59be9e1b372fa51f761d37d7d7eea5b9471a5e883c840d00dc7630248e524b79a1b724680c02 (root) VirtualMap state / genius-shove-coach-escape | |||||||||
| node1 | 6m 27.222s | 2025-11-17 21:58:01.466 | 9748 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+51+49.359203665Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+55+33.419738959Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node1 | 6m 27.222s | 2025-11-17 21:58:01.466 | 9749 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 811 File: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+55+33.419738959Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node1 | 6m 27.222s | 2025-11-17 21:58:01.466 | 9750 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node1 | 6m 27.229s | 2025-11-17 21:58:01.473 | 9751 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node1 | 6m 27.229s | 2025-11-17 21:58:01.473 | 9752 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 838 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/838 {"round":838,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/838/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node4 | 6m 27.229s | 2025-11-17 21:58:01.473 | 1909 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/10 for com.swirlds.demo.consistency.ConsistencyTestingToolState@35175250 | |
| node1 | 6m 27.231s | 2025-11-17 21:58:01.475 | 9753 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/155 | |
| node4 | 6m 27.231s | 2025-11-17 21:58:01.475 | 1910 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 838 Timestamp: 2025-11-17T21:58:00.071623263Z Next consensus number: 25222 Legacy running event hash: 20b58560604e8b03f7b23ef2a081a72232b639ab04b0fdcc66c53628c8abdef49072d853ccde6ef0d3b613fe1e32812b Legacy running event mnemonic: birth-favorite-antenna-panic Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 515487362 Root hash: ecf991c82965026809da59be9e1b372fa51f761d37d7d7eea5b9471a5e883c840d00dc7630248e524b79a1b724680c02 (root) VirtualMap state / genius-shove-coach-escape | |||||||||
| node4 | 6m 27.240s | 2025-11-17 21:58:01.484 | 1911 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/4/2025/11/17/2025-11-17T21+51+49.234736072Z_seq0_minr1_maxr396_orgn0.pces Last file: data/saved/preconsensus-events/4/2025/11/17/2025-11-17T21+57+35.203443728Z_seq1_minr751_maxr1251_orgn778.pces | |||||||||
| node4 | 6m 27.241s | 2025-11-17 21:58:01.485 | 1912 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 811 File: data/saved/preconsensus-events/4/2025/11/17/2025-11-17T21+57+35.203443728Z_seq1_minr751_maxr1251_orgn778.pces | |||||||||
| node4 | 6m 27.241s | 2025-11-17 21:58:01.485 | 1913 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node4 | 6m 27.245s | 2025-11-17 21:58:01.489 | 1914 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node4 | 6m 27.245s | 2025-11-17 21:58:01.489 | 1915 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 838 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/838 {"round":838,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/838/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node4 | 6m 27.247s | 2025-11-17 21:58:01.491 | 1916 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1 | |
| node3 | 6m 27.253s | 2025-11-17 21:58:01.497 | 9740 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 838 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/838 | |
| node3 | 6m 27.253s | 2025-11-17 21:58:01.497 | 9741 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for com.swirlds.demo.consistency.ConsistencyTestingToolState@7bfb7037 | |
| node2 | 6m 27.280s | 2025-11-17 21:58:01.524 | 9705 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for com.swirlds.demo.consistency.ConsistencyTestingToolState@648bfa59 | |
| node2 | 6m 27.282s | 2025-11-17 21:58:01.526 | 9706 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 838 Timestamp: 2025-11-17T21:58:00.071623263Z Next consensus number: 25222 Legacy running event hash: 20b58560604e8b03f7b23ef2a081a72232b639ab04b0fdcc66c53628c8abdef49072d853ccde6ef0d3b613fe1e32812b Legacy running event mnemonic: birth-favorite-antenna-panic Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 515487362 Root hash: ecf991c82965026809da59be9e1b372fa51f761d37d7d7eea5b9471a5e883c840d00dc7630248e524b79a1b724680c02 (root) VirtualMap state / genius-shove-coach-escape | |||||||||
| node2 | 6m 27.290s | 2025-11-17 21:58:01.534 | 9707 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+51+49.432009215Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+55+33.456355536Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node2 | 6m 27.290s | 2025-11-17 21:58:01.534 | 9708 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 811 File: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+55+33.456355536Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node2 | 6m 27.292s | 2025-11-17 21:58:01.536 | 9709 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node2 | 6m 27.299s | 2025-11-17 21:58:01.543 | 9710 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node2 | 6m 27.299s | 2025-11-17 21:58:01.543 | 9711 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 838 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/838 {"round":838,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/838/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node2 | 6m 27.301s | 2025-11-17 21:58:01.545 | 9712 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/155 | |
| node3 | 6m 27.335s | 2025-11-17 21:58:01.579 | 9775 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for com.swirlds.demo.consistency.ConsistencyTestingToolState@7bfb7037 | |
| node3 | 6m 27.338s | 2025-11-17 21:58:01.582 | 9776 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 838 Timestamp: 2025-11-17T21:58:00.071623263Z Next consensus number: 25222 Legacy running event hash: 20b58560604e8b03f7b23ef2a081a72232b639ab04b0fdcc66c53628c8abdef49072d853ccde6ef0d3b613fe1e32812b Legacy running event mnemonic: birth-favorite-antenna-panic Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 515487362 Root hash: ecf991c82965026809da59be9e1b372fa51f761d37d7d7eea5b9471a5e883c840d00dc7630248e524b79a1b724680c02 (root) VirtualMap state / genius-shove-coach-escape | |||||||||
| node3 | 6m 27.343s | 2025-11-17 21:58:01.587 | 9777 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+51+49.340458921Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+55+33.456332427Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node3 | 6m 27.344s | 2025-11-17 21:58:01.588 | 9778 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 811 File: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+55+33.456332427Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node3 | 6m 27.347s | 2025-11-17 21:58:01.591 | 9779 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node3 | 6m 27.354s | 2025-11-17 21:58:01.598 | 9780 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node3 | 6m 27.355s | 2025-11-17 21:58:01.599 | 9781 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 838 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/838 {"round":838,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/838/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node3 | 6m 27.356s | 2025-11-17 21:58:01.600 | 9782 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/155 | |
| node1 | 7m 27.257s | 2025-11-17 21:59:01.501 | 11242 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 972 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node2 | 7m 27.259s | 2025-11-17 21:59:01.503 | 11191 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 972 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node0 | 7m 27.264s | 2025-11-17 21:59:01.508 | 11780 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 972 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node3 | 7m 27.284s | 2025-11-17 21:59:01.528 | 11260 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 972 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node4 | 7m 27.369s | 2025-11-17 21:59:01.613 | 3385 | INFO | STATE_TO_DISK | <<scheduler TransactionHandler>> | DefaultSavedStateController: | Signed state from round 972 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT | |
| node0 | 7m 27.428s | 2025-11-17 21:59:01.672 | 11793 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 972 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/972 | |
| node0 | 7m 27.429s | 2025-11-17 21:59:01.673 | 11794 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/57 for com.swirlds.demo.consistency.ConsistencyTestingToolState@7bb6db7d | |
| node0 | 7m 27.516s | 2025-11-17 21:59:01.760 | 11833 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/57 for com.swirlds.demo.consistency.ConsistencyTestingToolState@7bb6db7d | |
| node0 | 7m 27.518s | 2025-11-17 21:59:01.762 | 11834 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 972 Timestamp: 2025-11-17T21:59:00.351395Z Next consensus number: 30098 Legacy running event hash: 1c92f2c33b921e78589efcdd59ef4702f2eacf79e863725bf9b30f453674cec654afca2e9d56a73c659b254f00c1eb02 Legacy running event mnemonic: myself-police-inch-someone Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -508807805 Root hash: 4fd70cd2263177471d55f571f93c8eabaec4867f67ebc32d29c6e5cbb60e9f4f7a12a0094834a503a9beca0ba1d26ae5 (root) VirtualMap state / wife-olympic-shiver-trophy | |||||||||
| node4 | 7m 27.518s | 2025-11-17 21:59:01.762 | 3398 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 972 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/972 | |
| node4 | 7m 27.519s | 2025-11-17 21:59:01.763 | 3399 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/17 for com.swirlds.demo.consistency.ConsistencyTestingToolState@43a3b548 | |
| node0 | 7m 27.525s | 2025-11-17 21:59:01.769 | 11835 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+51+49.530997129Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+55+33.307680567Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node0 | 7m 27.525s | 2025-11-17 21:59:01.769 | 11836 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 945 File: data/saved/preconsensus-events/0/2025/11/17/2025-11-17T21+55+33.307680567Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node0 | 7m 27.525s | 2025-11-17 21:59:01.769 | 11837 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node0 | 7m 27.535s | 2025-11-17 21:59:01.779 | 11838 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node0 | 7m 27.535s | 2025-11-17 21:59:01.779 | 11839 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 972 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/972 {"round":972,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/972/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node0 | 7m 27.537s | 2025-11-17 21:59:01.781 | 11840 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/289 | |
| node2 | 7m 27.539s | 2025-11-17 21:59:01.783 | 11194 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 972 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/972 | |
| node2 | 7m 27.539s | 2025-11-17 21:59:01.783 | 11195 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/57 for com.swirlds.demo.consistency.ConsistencyTestingToolState@70e5dddc | |
| node3 | 7m 27.589s | 2025-11-17 21:59:01.833 | 11263 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 972 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/972 | |
| node3 | 7m 27.590s | 2025-11-17 21:59:01.834 | 11264 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/57 for com.swirlds.demo.consistency.ConsistencyTestingToolState@57aff42 | |
| node1 | 7m 27.609s | 2025-11-17 21:59:01.853 | 11245 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Started writing round 972 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/972 | |
| node1 | 7m 27.609s | 2025-11-17 21:59:01.853 | 11246 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/59 for com.swirlds.demo.consistency.ConsistencyTestingToolState@2edef0ac | |
| node2 | 7m 27.624s | 2025-11-17 21:59:01.868 | 11226 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/57 for com.swirlds.demo.consistency.ConsistencyTestingToolState@70e5dddc | |
| node2 | 7m 27.626s | 2025-11-17 21:59:01.870 | 11227 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 972 Timestamp: 2025-11-17T21:59:00.351395Z Next consensus number: 30098 Legacy running event hash: 1c92f2c33b921e78589efcdd59ef4702f2eacf79e863725bf9b30f453674cec654afca2e9d56a73c659b254f00c1eb02 Legacy running event mnemonic: myself-police-inch-someone Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -508807805 Root hash: 4fd70cd2263177471d55f571f93c8eabaec4867f67ebc32d29c6e5cbb60e9f4f7a12a0094834a503a9beca0ba1d26ae5 (root) VirtualMap state / wife-olympic-shiver-trophy | |||||||||
| node2 | 7m 27.633s | 2025-11-17 21:59:01.877 | 11228 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+51+49.432009215Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+55+33.456355536Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node2 | 7m 27.633s | 2025-11-17 21:59:01.877 | 11229 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 945 File: data/saved/preconsensus-events/2/2025/11/17/2025-11-17T21+55+33.456355536Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node2 | 7m 27.633s | 2025-11-17 21:59:01.877 | 11230 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node2 | 7m 27.643s | 2025-11-17 21:59:01.887 | 11231 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node2 | 7m 27.643s | 2025-11-17 21:59:01.887 | 11232 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 972 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/972 {"round":972,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/972/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node2 | 7m 27.645s | 2025-11-17 21:59:01.889 | 11241 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/289 | |
| node4 | 7m 27.648s | 2025-11-17 21:59:01.892 | 3444 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/17 for com.swirlds.demo.consistency.ConsistencyTestingToolState@43a3b548 | |
| node4 | 7m 27.650s | 2025-11-17 21:59:01.894 | 3445 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 972 Timestamp: 2025-11-17T21:59:00.351395Z Next consensus number: 30098 Legacy running event hash: 1c92f2c33b921e78589efcdd59ef4702f2eacf79e863725bf9b30f453674cec654afca2e9d56a73c659b254f00c1eb02 Legacy running event mnemonic: myself-police-inch-someone Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -508807805 Root hash: 4fd70cd2263177471d55f571f93c8eabaec4867f67ebc32d29c6e5cbb60e9f4f7a12a0094834a503a9beca0ba1d26ae5 (root) VirtualMap state / wife-olympic-shiver-trophy | |||||||||
| node4 | 7m 27.658s | 2025-11-17 21:59:01.902 | 3446 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/4/2025/11/17/2025-11-17T21+51+49.234736072Z_seq0_minr1_maxr396_orgn0.pces Last file: data/saved/preconsensus-events/4/2025/11/17/2025-11-17T21+57+35.203443728Z_seq1_minr751_maxr1251_orgn778.pces | |||||||||
| node4 | 7m 27.658s | 2025-11-17 21:59:01.902 | 3447 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 945 File: data/saved/preconsensus-events/4/2025/11/17/2025-11-17T21+57+35.203443728Z_seq1_minr751_maxr1251_orgn778.pces | |||||||||
| node4 | 7m 27.658s | 2025-11-17 21:59:01.902 | 3448 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node4 | 7m 27.664s | 2025-11-17 21:59:01.908 | 3449 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node4 | 7m 27.664s | 2025-11-17 21:59:01.908 | 3450 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 972 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/972 {"round":972,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/972/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node4 | 7m 27.665s | 2025-11-17 21:59:01.909 | 3451 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/21 | |
| node3 | 7m 27.672s | 2025-11-17 21:59:01.916 | 11303 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/57 for com.swirlds.demo.consistency.ConsistencyTestingToolState@57aff42 | |
| node3 | 7m 27.674s | 2025-11-17 21:59:01.918 | 11304 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 972 Timestamp: 2025-11-17T21:59:00.351395Z Next consensus number: 30098 Legacy running event hash: 1c92f2c33b921e78589efcdd59ef4702f2eacf79e863725bf9b30f453674cec654afca2e9d56a73c659b254f00c1eb02 Legacy running event mnemonic: myself-police-inch-someone Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -508807805 Root hash: 4fd70cd2263177471d55f571f93c8eabaec4867f67ebc32d29c6e5cbb60e9f4f7a12a0094834a503a9beca0ba1d26ae5 (root) VirtualMap state / wife-olympic-shiver-trophy | |||||||||
| node3 | 7m 27.681s | 2025-11-17 21:59:01.925 | 11305 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+51+49.340458921Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+55+33.456332427Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node3 | 7m 27.681s | 2025-11-17 21:59:01.925 | 11306 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 945 File: data/saved/preconsensus-events/3/2025/11/17/2025-11-17T21+55+33.456332427Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node3 | 7m 27.681s | 2025-11-17 21:59:01.925 | 11307 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node3 | 7m 27.691s | 2025-11-17 21:59:01.935 | 11308 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node3 | 7m 27.691s | 2025-11-17 21:59:01.935 | 11309 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 972 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/972 {"round":972,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/972/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node3 | 7m 27.693s | 2025-11-17 21:59:01.937 | 11310 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/289 | |
| node1 | 7m 27.695s | 2025-11-17 21:59:01.939 | 11285 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | MerkleTreeSnapshotWriter: | Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/59 for com.swirlds.demo.consistency.ConsistencyTestingToolState@2edef0ac | |
| node1 | 7m 27.697s | 2025-11-17 21:59:01.941 | 11286 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Information for state written to disk: | |
| Round: 972 Timestamp: 2025-11-17T21:59:00.351395Z Next consensus number: 30098 Legacy running event hash: 1c92f2c33b921e78589efcdd59ef4702f2eacf79e863725bf9b30f453674cec654afca2e9d56a73c659b254f00c1eb02 Legacy running event mnemonic: myself-police-inch-someone Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -508807805 Root hash: 4fd70cd2263177471d55f571f93c8eabaec4867f67ebc32d29c6e5cbb60e9f4f7a12a0094834a503a9beca0ba1d26ae5 (root) VirtualMap state / wife-olympic-shiver-trophy | |||||||||
| node1 | 7m 27.703s | 2025-11-17 21:59:01.947 | 11287 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 2 preconsensus files on disk. | |
| First file: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+51+49.359203665Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+55+33.419738959Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node1 | 7m 27.703s | 2025-11-17 21:59:01.947 | 11288 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Found 1 preconsensus event file meeting specified criteria to copy. | |
| Lower bound: 945 File: data/saved/preconsensus-events/1/2025/11/17/2025-11-17T21+55+33.419738959Z_seq1_minr474_maxr5474_orgn0.pces | |||||||||
| node1 | 7m 27.704s | 2025-11-17 21:59:01.948 | 11289 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Copying 1 preconsensus event file(s) | |
| node1 | 7m 27.713s | 2025-11-17 21:59:01.957 | 11290 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | BestEffortPcesFileCopy: | Finished copying 1 preconsensus event file(s) | |
| node1 | 7m 27.714s | 2025-11-17 21:59:01.958 | 11291 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | SignedStateFileWriter: | Finished writing state for round 972 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/972 {"round":972,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/972/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload] | |
| node1 | 7m 27.715s | 2025-11-17 21:59:01.959 | 11292 | INFO | STATE_TO_DISK | <<scheduler StateSnapshotManager>> | FileUtils: | deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/289 | |
| node2 | 7m 57.119s | 2025-11-17 21:59:31.363 | 11933 | WARN | SOCKET_EXCEPTIONS | <<platform-core: SyncProtocolWith4 2 to 4>> | NetworkUtils: | Connection broken: 2 -> 4 | |
| com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-17T21:59:31.362172080Z at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293) at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47) at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79) at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:65) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200) at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654) at java.base/java.lang.Thread.run(Thread.java:1583) Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection or outbound has closed at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115) at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64) at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125) at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252) at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240) at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:387) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection reset at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318) at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346) at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796) at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099) at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489) at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483) at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70) at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461) at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73) at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63) at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291) at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347) at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420) at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399) at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208) at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319) at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:431) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more | |||||||||
| node0 | 7m 57.120s | 2025-11-17 21:59:31.364 | 12546 | WARN | SOCKET_EXCEPTIONS | <<platform-core: SyncProtocolWith4 0 to 4>> | NetworkUtils: | Connection broken: 0 -> 4 | |
| com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-17T21:59:31.362530737Z at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293) at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47) at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79) at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:65) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200) at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654) at java.base/java.lang.Thread.run(Thread.java:1583) Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection or outbound has closed at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115) at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64) at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125) at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252) at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240) at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:387) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection reset at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318) at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346) at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796) at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099) at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489) at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483) at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70) at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461) at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73) at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63) at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291) at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347) at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420) at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399) at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208) at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319) at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:431) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more | |||||||||
| node0 | 7m 57.345s | 2025-11-17 21:59:31.589 | 12555 | WARN | SOCKET_EXCEPTIONS | <<platform-core: SyncProtocolWith3 0 to 3>> | NetworkUtils: | Connection broken: 0 -> 3 | |
| com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-17T21:59:31.589252876Z at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293) at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47) at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79) at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:65) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200) at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654) at java.base/java.lang.Thread.run(Thread.java:1583) Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection or outbound has closed at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115) at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64) at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125) at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252) at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240) at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:387) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection reset at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318) at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346) at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796) at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099) at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489) at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483) at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70) at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461) at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73) at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63) at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291) at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347) at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420) at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399) at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208) at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319) at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:431) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more | |||||||||
| node0 | 7m 57.388s | 2025-11-17 21:59:31.632 | 12556 | WARN | SOCKET_EXCEPTIONS | <<platform-core: SyncProtocolWith1 0 to 1>> | NetworkUtils: | Connection broken: 0 -> 1 | |
| com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-17T21:59:31.628490181Z at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293) at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47) at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79) at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:65) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200) at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654) at java.base/java.lang.Thread.run(Thread.java:1583) Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection or outbound has closed at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115) at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64) at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125) at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252) at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240) at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:387) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection reset at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318) at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346) at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796) at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099) at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489) at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483) at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70) at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461) at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73) at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63) at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291) at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347) at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420) at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399) at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208) at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319) at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:431) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more | |||||||||
| node0 | 7m 57.513s | 2025-11-17 21:59:31.757 | 12557 | WARN | SOCKET_EXCEPTIONS | <<platform-core: SyncProtocolWith2 0 to 2>> | NetworkUtils: | Connection broken: 0 -> 2 | |
| com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-17T21:59:31.755737440Z at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293) at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47) at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79) at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:65) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599) at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200) at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654) at java.base/java.lang.Thread.run(Thread.java:1583) Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection or outbound has closed at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115) at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64) at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125) at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252) at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240) at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:387) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191) at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154) ... 8 more Caused by: java.net.SocketException: Connection reset at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318) at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346) at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796) at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099) at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489) at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483) at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70) at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461) at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066) at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73) at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63) at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291) at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347) at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420) at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399) at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208) at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319) at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:431) at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24) at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ... 2 more | |||||||||