Log columns: node ID, elapsed time, timestamp, sequence number, log level, log marker, thread, class, message.
node1 0.000ns 2025-11-20 18:09:48.954 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node1 95.000ms 2025-11-20 18:09:49.049 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node1 111.000ms 2025-11-20 18:09:49.065 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node1 229.000ms 2025-11-20 18:09:49.183 4 INFO STARTUP <main> Browser: The following nodes [1] are set to run locally
node1 255.000ms 2025-11-20 18:09:49.209 5 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node2 1.231s 2025-11-20 18:09:50.185 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node2 1.325s 2025-11-20 18:09:50.279 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node2 1.342s 2025-11-20 18:09:50.296 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 1.355s 2025-11-20 18:09:50.309 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node2 1.457s 2025-11-20 18:09:50.411 4 INFO STARTUP <main> Browser: The following nodes [2] are set to run locally
node4 1.458s 2025-11-20 18:09:50.412 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node4 1.477s 2025-11-20 18:09:50.431 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node2 1.485s 2025-11-20 18:09:50.439 5 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node1 1.541s 2025-11-20 18:09:50.495 6 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1285ms
node1 1.550s 2025-11-20 18:09:50.504 7 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node1 1.553s 2025-11-20 18:09:50.507 8 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node1 1.596s 2025-11-20 18:09:50.550 9 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node4 1.600s 2025-11-20 18:09:50.554 4 INFO STARTUP <main> Browser: The following nodes [4] are set to run locally
node4 1.629s 2025-11-20 18:09:50.583 5 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node1 1.657s 2025-11-20 18:09:50.611 10 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node1 1.658s 2025-11-20 18:09:50.612 11 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node3 2.043s 2025-11-20 18:09:50.997 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node3 2.145s 2025-11-20 18:09:51.099 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node3 2.162s 2025-11-20 18:09:51.116 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node3 2.280s 2025-11-20 18:09:51.234 4 INFO STARTUP <main> Browser: The following nodes [3] are set to run locally
node3 2.307s 2025-11-20 18:09:51.261 5 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node1 2.472s 2025-11-20 18:09:51.426 12 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node1 2.558s 2025-11-20 18:09:51.512 15 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 2.561s 2025-11-20 18:09:51.515 16 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node0 2.593s 2025-11-20 18:09:51.547 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node1 2.595s 2025-11-20 18:09:51.549 21 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node0 2.703s 2025-11-20 18:09:51.657 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node0 2.722s 2025-11-20 18:09:51.676 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node0 2.848s 2025-11-20 18:09:51.802 4 INFO STARTUP <main> Browser: The following nodes [0] are set to run locally
node0 2.877s 2025-11-20 18:09:51.831 5 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node2 3.163s 2025-11-20 18:09:52.117 6 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1677ms
node2 3.172s 2025-11-20 18:09:52.126 7 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node2 3.175s 2025-11-20 18:09:52.129 8 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node2 3.217s 2025-11-20 18:09:52.171 9 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node2 3.281s 2025-11-20 18:09:52.235 10 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node2 3.282s 2025-11-20 18:09:52.236 11 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node4 3.299s 2025-11-20 18:09:52.253 6 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1670ms
node4 3.309s 2025-11-20 18:09:52.263 7 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node4 3.312s 2025-11-20 18:09:52.266 8 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node1 3.344s 2025-11-20 18:09:52.298 24 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 3.346s 2025-11-20 18:09:52.300 27 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node1 3.353s 2025-11-20 18:09:52.307 28 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node4 3.356s 2025-11-20 18:09:52.310 9 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node1 3.364s 2025-11-20 18:09:52.318 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 3.366s 2025-11-20 18:09:52.320 30 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 3.429s 2025-11-20 18:09:52.383 10 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node4 3.430s 2025-11-20 18:09:52.384 11 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node3 4.066s 2025-11-20 18:09:53.020 6 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1758ms
node3 4.074s 2025-11-20 18:09:53.028 7 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node3 4.077s 2025-11-20 18:09:53.031 8 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node3 4.117s 2025-11-20 18:09:53.071 9 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node2 4.194s 2025-11-20 18:09:53.148 12 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node3 4.195s 2025-11-20 18:09:53.149 10 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node3 4.197s 2025-11-20 18:09:53.151 11 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node2 4.289s 2025-11-20 18:09:53.243 15 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 4.292s 2025-11-20 18:09:53.246 16 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node2 4.328s 2025-11-20 18:09:53.282 21 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 4.377s 2025-11-20 18:09:53.331 12 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node1 4.477s 2025-11-20 18:09:53.431 31 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26204877]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=213800, randomLong=-5651430619413627832, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=11130, randomLong=-6927270373371874015, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1437500, data=35, exception=null]
OS Health Check Report - Complete (took 1022 ms)
node4 4.479s 2025-11-20 18:09:53.433 15 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 4.482s 2025-11-20 18:09:53.436 16 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node0 4.487s 2025-11-20 18:09:53.441 6 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1609ms
node0 4.496s 2025-11-20 18:09:53.450 7 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node0 4.499s 2025-11-20 18:09:53.453 8 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node1 4.506s 2025-11-20 18:09:53.460 32 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node1 4.514s 2025-11-20 18:09:53.468 33 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node1 4.515s 2025-11-20 18:09:53.469 34 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node4 4.522s 2025-11-20 18:09:53.476 21 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node0 4.545s 2025-11-20 18:09:53.499 9 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node1 4.601s 2025-11-20 18:09:53.555 35 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIdUmpLKzyXgUwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBALXCoDQ+HOVsEDTZpFuJITSaGwaKX2is5K1P/lV+G+ll6u36IdqKNnZIirJrpX2N0Ad6NeF/oFcMhietrKt818PDA9Tbb2tqcHNKTxxZAEj7amQTsrU4EsNmUhaPgMs89yj9WLxCXVzW05cQjqYEA/hymzohWs1BdU3Y2KdmELe0v5fzRgDpNgYHhUN7IrlrlgXEWpuKRskBYc4PIvyACijY0/zkeEAyHOshYYGKhQbNm/NGWhFq83ro77CZZhX3Vl7hRnHLaEoCEE8atY8R1Txhy8aObhiS6R8ZVRTkZLar/FG/xe78RQfwHHD1al2w5oHR7xgTZylhbD+nVQ09Zmi25USpvqwumbMBE0OWhV+VH1WLCHfLQs6/5yuDjeZ/0D9tpQ8pfkiEkGLedzUzQkq+4/HmN4IFTOhgJHlu1tVUqohZIPZ5zSzqkqFzFQGRo2uAX8C2EJ3qgQMAEOpH8iOjiSKsezlIPuwvmrVDPxVfpY2Cq60oxRu6B8bZdbQkfwIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQAloxwiVu7pBhkO4fLqYRw4FC0VEx+c47W4xnrq3G/uXMGwE2Mfwple9FZnfT9JgSoT1UVw+cigo4720WdrPqkK8qnA3/PzGXlfJ3k6eFcBuli/KY1TakIJUAxFt5biNKatheMwAKsbF/JyVyaqG2dbSaXQ6hZBLQTYmLrmFWMvi9QdM1S8vNVMjn0hE2qQJtnVRuVwqRaAQ225jDv2CUCT28t0EWE6ccbiRi74l8KoW1Lo3v2EQ6ZZ89Xt3CwFSQHa6YVT685ECy82qMysU+YHBe9WmwJW05UAAY7JRsOo+RuuU/r4acNLmzprG+l7qsqqPkwXTcziw9Y2OYsFgY4bTlIOV0JC0AYApctDB3gbn83LM73CWccGrXq0liSV0wL11wscH3gFohXrwb646+6hgncZiDshlZlWaFSkHQJAxTR9bsbsCwKdZpzIIVOVTOT/3oLQKCCQvPriTpJiNa0P6gB0pq64lNcyG9fL8vS3YFFnWJTZwb8ZzGK+LZ91/2Y=", "gossipEndpoint": [{ "ipAddressV4": "iHFhtA==", "port": 30124 }, { "ipAddressV4": "CoAAEQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAJguXwyGFpb8MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTIwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTIwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDYXoYHBtw8adD5sxLZSnlG9XgBLWVbIDl3YA4rZZ11cgl6FG2TvF8UVNXQ177cRm1xUUJRI5ulSgDofnm7Iuf6c/GoQrud2nP1yMWewGslwiEi1h2pxbN7doFvn/92Y0lJVwSV/vOpbIyPRoMeF0jXd7TEI7dYj4S7gV9uWmQCIWjwTZqVsjIAtzEkYnmS0/m5XuD9MJsin8OQRu/PEFL8qaVPQJ2GhOhpUJqvADQ/Lsq/FHcPjylcRcnUQlFRojk2jqugtoRegByjPrAOSYGJeWUCVYmd7W51L/AkVx1rDLeHj0zLTTzQRF5G56i+S+tAcpY/uiCrwLvszFlDlD1diOuaucmu54lalrSTlVe5eOyq2ga2tKi11LQ+w09105zLyRWk7DBU93f5dTYNSmokI7b4sVRxu6SP0p/F9wND77wv2Ax5OpIWWty8zy8Y+xOuRyFu/rJ4ddDmRYvRmptM0rCAfv6hgd3m5Y/OAadQm/OuN91Uq9PIJdlMtjDbIfECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEANutmL3V1PlvlsZ6xG8Sx9cKTok3kf3rBf7D7eE8Nn8ryHi3cw9CvCaj1E6zmTTh9k23DAZVWulhjTY5GWcx5NO7QAWjKau44g/HecNNrWsD/+nIrhmAk2WxKp175CwqJaIWA7CM6VMfFktjaflUPcB6RJnHrAa8M1HUpEsBz0mFmLz7lIaDemxYCE8M8slb6wTMjpL83GB+ejudRe7YK2ZWixM+CGp0ARkV+EecHaCXgEoROUNwP6mZVJcgSVR1QBQwcGAMIrutsKENM8HR9o3LWacigoJXf+IX8c6aJhrHfFvm62q+hi3baj7iR6gebEdWPtmEXgoVWOk230fLGyPU1oBxaDdYa8V4+ZFv03O91By9tuFrwZOcLCb4CPRyr8A47lHNjRIeo2nUF/c+SjV0eBcPKCnn1nW/AQWCxJ0QzzG6tEeMAGdDrE2ujPlB+Y9Sn8vB0zjYQHTr1NKyyXNogB4y48jofLDLDGOQYI6uP2fDgZeiq4dV8w91WbPHV", "gossipEndpoint": [{ "ipAddressV4": "IkW49A==", "port": 30125 }, { "ipAddressV4": "CoAAGA==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJwswl59m488MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDAqlNMpfduuW0ETQVjdKf5ZBe3Ug/ybRMoCWIlue8UoxFzamAtoeFEW3GVi862iImRVyHbkBZzDQUw4ABwMdxfzTL9voozkMaOZb4KQ9yZ9zNLAAmSSuE6RFmSJnBtfufxFXqiu6esbcvyropjZLc65F2uoMCpKN0CHFpWEb2GZAaipp7WCOon0NllDLqkjPylluXO4mjbzzMSDPbBWRD8VjjkxZeszWSXYxz9hqcRYX01CGg+jhooCQ6j2yB8sfFAffIeTG6GSV1uCFa4san2emhQWpr+cHaVYJMtejL43HaEVQnF3vh5Z10T/7co63C63aay2hs6Bx5SschosyYiafI7GtbQ4qpOgjEDFT1jlydK21gy6MV3SFEYwcUfxvxxRj6pS7xiMFn4FYnBKPJWkaDkwTqboEshxstvASQOW993uEwzh4EjctRHSjSuTU6S9OsWi5I5cRF+xK6GaWsTp0KyO8uVpuM9kZfpOcor294quyKJ9nylNyIt/m8Q8/ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAqPLB/xr0Yv1l9w/RO+bqFtl8TkxF/6jOqoEUXY06dEInopLYpmkksZZ9G8vebt6hAoLjaxNMdRqCkzKgy4jn7/SQZNV9FMbZ7ckiDxsBxYZ2ZaBootuWzzVD6hCSO3Tg6JgkIzldtFtNcDVBRgZnHg+Rl6hn+gFV5S2OTTTPHWK7GHwgHXLhK7N0RL4YVrRCi/HTUZnuYCjBwvdDte5iqytY05cAO4p72P6YtDaOdAfL/IIKd1ylCWITDqTp/JDBz1uxjQmsXLVD/KEEtlvYlGjIr+wUUqIUPhFvB6ajl2NO0D/r+t1BH454zbodU92QnOJpXpoNuOv7jjALHCqo70mCSwTNUSZuVP6/KLmQe8sSzYs7O/c25FzHKBYy+aZujoa/X7aI6XVmsUkj6ae9MSvQurk0jMNg/Jy5EtWOMy7WEuyadrAv6KSP3oIfmL9jWoPcyOMfvjRHxGqOfZuFZatAwswY6O0E3ATTrN03t/BVqNHIYIXc6UOiUTo2Nx56", "gossipEndpoint": [{ "ipAddressV4": "IhBJhQ==", "port": 30126 }, { "ipAddressV4": "CoAALA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAOxH0o7YkAUoMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDf6+SJl+puqRNd5r2Tb802jQTqPm7k3NXIeU8NQ3Hy9p0G+9p4Hgnt3ftipar7lKPKnp4PFrOP7E7XSKpafxK2OVQ0jTMvc6Yjqt+9mzyNSI1I8cSHTmhJ7kMBt0+NwVM8QN+fbKcbQaoNiPwMcckVtGeMad4aZM6hRyxzI0H3wgMj4JiM9VRwx7JbEo3R7akRwLwGr9ZQm2EQwqiyReNkBnXrsyP4KPPVAoeMfGchoAuBbV+r6v1OeYddocYmZkrsvMXUKF/uEcgd8gTu+pv3jObwIEVqXo1yC6ZlCFqO7LIvT8jTAAljkszoo67ykXTbKS0PZeLDg6nvdPvBMQ50yjfswR88S6N8VU6pud7Y+VbMYUiGzlrFi4MB9dikAjEj4PEetQyZdn84ZXGxerXlU/vTO2Fp4i1ec5rmX1P0WYMlbNELE408j5nfCfzD/qdcF5HZAiUVTYU/SWpzWcn34++KGpuqZZQdsGwCLQWeMeA/OEemYChis4cO94aOzrECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAlj5YIsbYXk2JGP9kRCBLDgz27ymYi1KDbO8g18V4T0zj2Zl7858U7mF9UBSSW+Cjl1UtUdvqFWZhh8jRoO3Jov1QGTULHRfyyPElD4VpwFribiu4GYJaodYy6NE50WwSJf32gLG0jHQWt7q+cOrn6WaG2h8O1sIxbTlnu1kqKQUQtu4oX8u23b5m9QXVJfJVdecwD5Rmab2d3dq/NNv2iNELH0myqtcoqw26xwIvXwaS4Gqi+Y0cOfjWL5Gv5AHIwvBXGIh3KUU7pbyBzqjkigbzSeoZw0C8G2cRTl0+QTuet2SVYlFh5J9/FBLvIfMfIpguglaU6xTVoRpo7RF24qQKFt2IlBROpqcwl0FyfE+2c19FGt1V8E5dYqE4T2mHT6FSOI3DckA2afBm1OCeMNtkqCQT8x+JvdKrgUh44QDm4PIVZDzaxog/zOzRWPCgpCPq0HcNMzgCVFt+4q8eTL9Ju/rQcS9bDosjMA69NGLIOCdPW2i/gkS9x9rTXgyp", "gossipEndpoint": [{ "ipAddressV4": "IoiZJQ==", "port": 30127 }, { "ipAddressV4": "CoAAKQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIXlngkVEv6iMwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAL4o3FK8th1cG+FSlw4iT9FlkwK+hOj4Ay6Z70mZlsNwszgxvddUEO4BEdA1iSWfxkYOLl4QwwPr3l394a07VfB5OK3dqJ6CjVdByyvzghtk3gOpkskWlJxp6vah7BbIJFWE8off7fhCdwAGSrwIRdGE8u8GbKJIdHk6/XyjB3j0BXTIgeaPTJxLeuz/2l/dQVRMXyZNxlc5UVQYnX9haMRk7M5bkb9uwfYPRikEJFp6G72x7M7Q9lBGJ3ArCQn/lPJfHSg01GxfDhWH8DOwLaFdv1bCs2zHTn7R7Wq9ymXvkUsZhlYO4mLR8HKDcM3sCrJa2rg8vgnIoZupHABKxkgtT2wxV7fM5f2oiz0mDYDTRJpgmK1lmNANj2tKnGqeDnsW7Q3zwufgZZhbks8+8uigyOyKNbp6D7Vv5KeYRibjr/xh+yWT0v02dtpBIdhqDa5CUVD9fCwigZj3PQc8N4e47ZL6s1pXpQ6Cf0lB0fSsvyhnGRa8HMx2q5eg5j/lCQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQCr9yUzOoi0xhoDE1mqR3FR/iVCq9PaBUURWL743LDMrlEvpzKX0upcwwwdgJFjVqVUywh6rKeHQt4O4UV6FIbpp0PSjSE7XZSK3UNqnhZJhQ3aNrOP+6wBhm2B0ZjrxyMS1EWeD9tcNkdYluO00RlieAEV4zwoAfeFPSB21iXW5dU8idhNuTLptDc7SJoErxN+44jvcrSe/ZhpQohG6WfyDPH0BE1tyzsiD29PAWKkrfhg5kzjTAP/qFp+ByazeltP9/F0NXI5AHbE0pKYr56XUlwDfDZOTU9b1YeS7kKyPvccvC2j9NjGGM7NjafdFLHUTYBZiNUTZXVstddYtTCVbTqI7I/x6hoeeNVDZv7XluwZLrYsDNsNrWU3c9VijPK1CE5Owy+gJoGgxEHfA/n9Jvc3lEesqKBpW92RazkpHW2eD9wh8Ayv3q6PNDGzWyiXA8YWW6yD/dIp2Oh8szZUfOXy8sQ8VW86T6RsqGP5CKKPGW1NnP/KTKe5/WoBLZQ=", "gossipEndpoint": [{ "ipAddressV4": "Ink6nQ==", "port": 30128 }, { "ipAddressV4": "CoAAJQ==", "port": 30128 }] }] }
node0 4.615s 2025-11-20 18:09:53.569 10 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node0 4.616s 2025-11-20 18:09:53.570 11 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node1 4.625s 2025-11-20 18:09:53.579 36 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/1/ConsistencyTestLog.csv
node1 4.626s 2025-11-20 18:09:53.580 37 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node1 4.641s 2025-11-20 18:09:53.595 38 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: e55cb0feb26a5e52b793a5db159084e9a1ab2ad3b3045a9e762a1d28d641344e0d63488bafd027ea0dc5f2e339f0f936
(root) VirtualMap state / fee-theory-ready-ski {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":2,"lastLeafPath":4},"Singletons":{"RosterService.ROSTER_STATE":{"path":2,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":3,"mnemonic":"normal-stage-book-frozen"}}}
node1 4.644s 2025-11-20 18:09:53.598 40 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: Starting the ReconnectController
node1 4.857s 2025-11-20 18:09:53.811 41 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node1 4.861s 2025-11-20 18:09:53.815 42 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node1 4.865s 2025-11-20 18:09:53.819 43 INFO STARTUP <<start-node-1>> ConsistencyTestingToolMain: init called in Main for node 1.
node1 4.866s 2025-11-20 18:09:53.820 44 INFO STARTUP <<start-node-1>> SwirldsPlatform: Starting platform 1
node1 4.867s 2025-11-20 18:09:53.821 45 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node1 4.870s 2025-11-20 18:09:53.824 46 INFO STARTUP <<start-node-1>> CycleFinder: No cyclical back pressure detected in wiring model.
node1 4.871s 2025-11-20 18:09:53.825 47 INFO STARTUP <<start-node-1>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node1 4.872s 2025-11-20 18:09:53.826 48 INFO STARTUP <<start-node-1>> InputWireChecks: All input wires have been bound.
node1 4.873s 2025-11-20 18:09:53.827 49 WARN STARTUP <<start-node-1>> PcesFileTracker: No preconsensus event files available
node1 4.873s 2025-11-20 18:09:53.827 50 INFO STARTUP <<start-node-1>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node1 4.875s 2025-11-20 18:09:53.829 51 INFO STARTUP <<start-node-1>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node1 4.876s 2025-11-20 18:09:53.830 52 INFO STARTUP <<app: appMain 1>> ConsistencyTestingToolMain: run called in Main.
node1 4.878s 2025-11-20 18:09:53.832 53 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 182.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node1 4.882s 2025-11-20 18:09:53.836 54 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 3.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node3 5.067s 2025-11-20 18:09:54.021 12 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node3 5.157s 2025-11-20 18:09:54.111 15 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 5.159s 2025-11-20 18:09:54.113 16 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node2 5.163s 2025-11-20 18:09:54.117 24 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 5.165s 2025-11-20 18:09:54.119 26 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node2 5.171s 2025-11-20 18:09:54.125 28 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node2 5.181s 2025-11-20 18:09:54.135 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 5.184s 2025-11-20 18:09:54.138 30 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 5.196s 2025-11-20 18:09:54.150 21 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 5.390s 2025-11-20 18:09:54.344 24 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5.392s 2025-11-20 18:09:54.346 27 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node4 5.398s 2025-11-20 18:09:54.352 28 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node4 5.408s 2025-11-20 18:09:54.362 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5.411s 2025-11-20 18:09:54.365 30 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 5.528s 2025-11-20 18:09:54.482 12 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node0 5.637s 2025-11-20 18:09:54.591 15 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 5.641s 2025-11-20 18:09:54.595 16 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node0 5.689s 2025-11-20 18:09:54.643 21 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node3 6.018s 2025-11-20 18:09:54.972 24 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 6.020s 2025-11-20 18:09:54.974 27 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node3 6.026s 2025-11-20 18:09:54.980 28 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node3 6.038s 2025-11-20 18:09:54.992 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 6.041s 2025-11-20 18:09:54.995 30 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 6.310s 2025-11-20 18:09:55.264 31 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26365765]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=216230, randomLong=2570420901621353433, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=8211, randomLong=4328592305454152332, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1432819, data=35, exception=null]
OS Health Check Report - Complete (took 1025 ms)
node2 6.343s 2025-11-20 18:09:55.297 32 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node2 6.353s 2025-11-20 18:09:55.307 33 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node2 6.355s 2025-11-20 18:09:55.309 34 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node2 6.450s 2025-11-20 18:09:55.404 35 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIdUmpLKzyXgUwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBALXCoDQ+HOVsEDTZpFuJITSaGwaKX2is5K1P/lV+G+ll6u36IdqKNnZIirJrpX2N0Ad6NeF/oFcMhietrKt818PDA9Tbb2tqcHNKTxxZAEj7amQTsrU4EsNmUhaPgMs89yj9WLxCXVzW05cQjqYEA/hymzohWs1BdU3Y2KdmELe0v5fzRgDpNgYHhUN7IrlrlgXEWpuKRskBYc4PIvyACijY0/zkeEAyHOshYYGKhQbNm/NGWhFq83ro77CZZhX3Vl7hRnHLaEoCEE8atY8R1Txhy8aObhiS6R8ZVRTkZLar/FG/xe78RQfwHHD1al2w5oHR7xgTZylhbD+nVQ09Zmi25USpvqwumbMBE0OWhV+VH1WLCHfLQs6/5yuDjeZ/0D9tpQ8pfkiEkGLedzUzQkq+4/HmN4IFTOhgJHlu1tVUqohZIPZ5zSzqkqFzFQGRo2uAX8C2EJ3qgQMAEOpH8iOjiSKsezlIPuwvmrVDPxVfpY2Cq60oxRu6B8bZdbQkfwIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQAloxwiVu7pBhkO4fLqYRw4FC0VEx+c47W4xnrq3G/uXMGwE2Mfwple9FZnfT9JgSoT1UVw+cigo4720WdrPqkK8qnA3/PzGXlfJ3k6eFcBuli/KY1TakIJUAxFt5biNKatheMwAKsbF/JyVyaqG2dbSaXQ6hZBLQTYmLrmFWMvi9QdM1S8vNVMjn0hE2qQJtnVRuVwqRaAQ225jDv2CUCT28t0EWE6ccbiRi74l8KoW1Lo3v2EQ6ZZ89Xt3CwFSQHa6YVT685ECy82qMysU+YHBe9WmwJW05UAAY7JRsOo+RuuU/r4acNLmzprG+l7qsqqPkwXTcziw9Y2OYsFgY4bTlIOV0JC0AYApctDB3gbn83LM73CWccGrXq0liSV0wL11wscH3gFohXrwb646+6hgncZiDshlZlWaFSkHQJAxTR9bsbsCwKdZpzIIVOVTOT/3oLQKCCQvPriTpJiNa0P6gB0pq64lNcyG9fL8vS3YFFnWJTZwb8ZzGK+LZ91/2Y=", "gossipEndpoint": [{ "ipAddressV4": "iHFhtA==", "port": 30124 }, { "ipAddressV4": "CoAAEQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAJguXwyGFpb8MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTIwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTIwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDYXoYHBtw8adD5sxLZSnlG9XgBLWVbIDl3YA4rZZ11cgl6FG2TvF8UVNXQ177cRm1xUUJRI5ulSgDofnm7Iuf6c/GoQrud2nP1yMWewGslwiEi1h2pxbN7doFvn/92Y0lJVwSV/vOpbIyPRoMeF0jXd7TEI7dYj4S7gV9uWmQCIWjwTZqVsjIAtzEkYnmS0/m5XuD9MJsin8OQRu/PEFL8qaVPQJ2GhOhpUJqvADQ/Lsq/FHcPjylcRcnUQlFRojk2jqugtoRegByjPrAOSYGJeWUCVYmd7W51L/AkVx1rDLeHj0zLTTzQRF5G56i+S+tAcpY/uiCrwLvszFlDlD1diOuaucmu54lalrSTlVe5eOyq2ga2tKi11LQ+w09105zLyRWk7DBU93f5dTYNSmokI7b4sVRxu6SP0p/F9wND77wv2Ax5OpIWWty8zy8Y+xOuRyFu/rJ4ddDmRYvRmptM0rCAfv6hgd3m5Y/OAadQm/OuN91Uq9PIJdlMtjDbIfECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEANutmL3V1PlvlsZ6xG8Sx9cKTok3kf3rBf7D7eE8Nn8ryHi3cw9CvCaj1E6zmTTh9k23DAZVWulhjTY5GWcx5NO7QAWjKau44g/HecNNrWsD/+nIrhmAk2WxKp175CwqJaIWA7CM6VMfFktjaflUPcB6RJnHrAa8M1HUpEsBz0mFmLz7lIaDemxYCE8M8slb6wTMjpL83GB+ejudRe7YK2ZWixM+CGp0ARkV+EecHaCXgEoROUNwP6mZVJcgSVR1QBQwcGAMIrutsKENM8HR9o3LWacigoJXf+IX8c6aJhrHfFvm62q+hi3baj7iR6gebEdWPtmEXgoVWOk230fLGyPU1oBxaDdYa8V4+ZFv03O91By9tuFrwZOcLCb4CPRyr8A47lHNjRIeo2nUF/c+SjV0eBcPKCnn1nW/AQWCxJ0QzzG6tEeMAGdDrE2ujPlB+Y9Sn8vB0zjYQHTr1NKyyXNogB4y48jofLDLDGOQYI6uP2fDgZeiq4dV8w91WbPHV", "gossipEndpoint": [{ "ipAddressV4": "IkW49A==", "port": 30125 }, { "ipAddressV4": "CoAAGA==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJwswl59m488MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDAqlNMpfduuW0ETQVjdKf5ZBe3Ug/ybRMoCWIlue8UoxFzamAtoeFEW3GVi862iImRVyHbkBZzDQUw4ABwMdxfzTL9voozkMaOZb4KQ9yZ9zNLAAmSSuE6RFmSJnBtfufxFXqiu6esbcvyropjZLc65F2uoMCpKN0CHFpWEb2GZAaipp7WCOon0NllDLqkjPylluXO4mjbzzMSDPbBWRD8VjjkxZeszWSXYxz9hqcRYX01CGg+jhooCQ6j2yB8sfFAffIeTG6GSV1uCFa4san2emhQWpr+cHaVYJMtejL43HaEVQnF3vh5Z10T/7co63C63aay2hs6Bx5SschosyYiafI7GtbQ4qpOgjEDFT1jlydK21gy6MV3SFEYwcUfxvxxRj6pS7xiMFn4FYnBKPJWkaDkwTqboEshxstvASQOW993uEwzh4EjctRHSjSuTU6S9OsWi5I5cRF+xK6GaWsTp0KyO8uVpuM9kZfpOcor294quyKJ9nylNyIt/m8Q8/ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAqPLB/xr0Yv1l9w/RO+bqFtl8TkxF/6jOqoEUXY06dEInopLYpmkksZZ9G8vebt6hAoLjaxNMdRqCkzKgy4jn7/SQZNV9FMbZ7ckiDxsBxYZ2ZaBootuWzzVD6hCSO3Tg6JgkIzldtFtNcDVBRgZnHg+Rl6hn+gFV5S2OTTTPHWK7GHwgHXLhK7N0RL4YVrRCi/HTUZnuYCjBwvdDte5iqytY05cAO4p72P6YtDaOdAfL/IIKd1ylCWITDqTp/JDBz1uxjQmsXLVD/KEEtlvYlGjIr+wUUqIUPhFvB6ajl2NO0D/r+t1BH454zbodU92QnOJpXpoNuOv7jjALHCqo70mCSwTNUSZuVP6/KLmQe8sSzYs7O/c25FzHKBYy+aZujoa/X7aI6XVmsUkj6ae9MSvQurk0jMNg/Jy5EtWOMy7WEuyadrAv6KSP3oIfmL9jWoPcyOMfvjRHxGqOfZuFZatAwswY6O0E3ATTrN03t/BVqNHIYIXc6UOiUTo2Nx56", "gossipEndpoint": [{ "ipAddressV4": "IhBJhQ==", "port": 30126 }, { "ipAddressV4": "CoAALA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAOxH0o7YkAUoMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDf6+SJl+puqRNd5r2Tb802jQTqPm7k3NXIeU8NQ3Hy9p0G+9p4Hgnt3ftipar7lKPKnp4PFrOP7E7XSKpafxK2OVQ0jTMvc6Yjqt+9mzyNSI1I8cSHTmhJ7kMBt0+NwVM8QN+fbKcbQaoNiPwMcckVtGeMad4aZM6hRyxzI0H3wgMj4JiM9VRwx7JbEo3R7akRwLwGr9ZQm2EQwqiyReNkBnXrsyP4KPPVAoeMfGchoAuBbV+r6v1OeYddocYmZkrsvMXUKF/uEcgd8gTu+pv3jObwIEVqXo1yC6ZlCFqO7LIvT8jTAAljkszoo67ykXTbKS0PZeLDg6nvdPvBMQ50yjfswR88S6N8VU6pud7Y+VbMYUiGzlrFi4MB9dikAjEj4PEetQyZdn84ZXGxerXlU/vTO2Fp4i1ec5rmX1P0WYMlbNELE408j5nfCfzD/qdcF5HZAiUVTYU/SWpzWcn34++KGpuqZZQdsGwCLQWeMeA/OEemYChis4cO94aOzrECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAlj5YIsbYXk2JGP9kRCBLDgz27ymYi1KDbO8g18V4T0zj2Zl7858U7mF9UBSSW+Cjl1UtUdvqFWZhh8jRoO3Jov1QGTULHRfyyPElD4VpwFribiu4GYJaodYy6NE50WwSJf32gLG0jHQWt7q+cOrn6WaG2h8O1sIxbTlnu1kqKQUQtu4oX8u23b5m9QXVJfJVdecwD5Rmab2d3dq/NNv2iNELH0myqtcoqw26xwIvXwaS4Gqi+Y0cOfjWL5Gv5AHIwvBXGIh3KUU7pbyBzqjkigbzSeoZw0C8G2cRTl0+QTuet2SVYlFh5J9/FBLvIfMfIpguglaU6xTVoRpo7RF24qQKFt2IlBROpqcwl0FyfE+2c19FGt1V8E5dYqE4T2mHT6FSOI3DckA2afBm1OCeMNtkqCQT8x+JvdKrgUh44QDm4PIVZDzaxog/zOzRWPCgpCPq0HcNMzgCVFt+4q8eTL9Ju/rQcS9bDosjMA69NGLIOCdPW2i/gkS9x9rTXgyp", "gossipEndpoint": [{ "ipAddressV4": "IoiZJQ==", "port": 30127 }, { "ipAddressV4": "CoAAKQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIXlngkVEv6iMwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAL4o3FK8th1cG+FSlw4iT9FlkwK+hOj4Ay6Z70mZlsNwszgxvddUEO4BEdA1iSWfxkYOLl4QwwPr3l394a07VfB5OK3dqJ6CjVdByyvzghtk3gOpkskWlJxp6vah7BbIJFWE8off7fhCdwAGSrwIRdGE8u8GbKJIdHk6/XyjB3j0BXTIgeaPTJxLeuz/2l/dQVRMXyZNxlc5UVQYnX9haMRk7M5bkb9uwfYPRikEJFp6G72x7M7Q9lBGJ3ArCQn/lPJfHSg01GxfDhWH8DOwLaFdv1bCs2zHTn7R7Wq9ymXvkUsZhlYO4mLR8HKDcM3sCrJa2rg8vgnIoZupHABKxkgtT2wxV7fM5f2oiz0mDYDTRJpgmK1lmNANj2tKnGqeDnsW7Q3zwufgZZhbks8+8uigyOyKNbp6D7Vv5KeYRibjr/xh+yWT0v02dtpBIdhqDa5CUVD9fCwigZj3PQc8N4e47ZL6s1pXpQ6Cf0lB0fSsvyhnGRa8HMx2q5eg5j/lCQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQCr9yUzOoi0xhoDE1mqR3FR/iVCq9PaBUURWL743LDMrlEvpzKX0upcwwwdgJFjVqVUywh6rKeHQt4O4UV6FIbpp0PSjSE7XZSK3UNqnhZJhQ3aNrOP+6wBhm2B0ZjrxyMS1EWeD9tcNkdYluO00RlieAEV4zwoAfeFPSB21iXW5dU8idhNuTLptDc7SJoErxN+44jvcrSe/ZhpQohG6WfyDPH0BE1tyzsiD29PAWKkrfhg5kzjTAP/qFp+ByazeltP9/F0NXI5AHbE0pKYr56XUlwDfDZOTU9b1YeS7kKyPvccvC2j9NjGGM7NjafdFLHUTYBZiNUTZXVstddYtTCVbTqI7I/x6hoeeNVDZv7XluwZLrYsDNsNrWU3c9VijPK1CE5Owy+gJoGgxEHfA/n9Jvc3lEesqKBpW92RazkpHW2eD9wh8Ayv3q6PNDGzWyiXA8YWW6yD/dIp2Oh8szZUfOXy8sQ8VW86T6RsqGP5CKKPGW1NnP/KTKe5/WoBLZQ=", "gossipEndpoint": [{ "ipAddressV4": "Ink6nQ==", "port": 30128 }, { "ipAddressV4": "CoAAJQ==", "port": 30128 }] }] }
node2 6.475s 2025-11-20 18:09:55.429 36 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/2/ConsistencyTestLog.csv
node2 6.476s 2025-11-20 18:09:55.430 37 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node2 6.494s 2025-11-20 18:09:55.448 38 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: e55cb0feb26a5e52b793a5db159084e9a1ab2ad3b3045a9e762a1d28d641344e0d63488bafd027ea0dc5f2e339f0f936
(root) VirtualMap state / fee-theory-ready-ski {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":2,"lastLeafPath":4},"Singletons":{"RosterService.ROSTER_STATE":{"path":2,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":3,"mnemonic":"normal-stage-book-frozen"}}}
node2 6.497s 2025-11-20 18:09:55.451 40 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: Starting the ReconnectController
node4 6.534s 2025-11-20 18:09:55.488 31 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26259995]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=220310, randomLong=8476900504957603142, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=14049, randomLong=464915218121496613, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1530991, data=35, exception=null]
OS Health Check Report - Complete (took 1025 ms)
node0 6.563s 2025-11-20 18:09:55.517 24 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 6.565s 2025-11-20 18:09:55.519 27 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node4 6.567s 2025-11-20 18:09:55.521 32 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node0 6.572s 2025-11-20 18:09:55.526 28 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node4 6.576s 2025-11-20 18:09:55.530 33 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node4 6.579s 2025-11-20 18:09:55.533 34 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node0 6.584s 2025-11-20 18:09:55.538 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 6.587s 2025-11-20 18:09:55.541 30 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 6.678s 2025-11-20 18:09:55.632 35 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIdUmpLKzyXgUwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBALXCoDQ+HOVsEDTZpFuJITSaGwaKX2is5K1P/lV+G+ll6u36IdqKNnZIirJrpX2N0Ad6NeF/oFcMhietrKt818PDA9Tbb2tqcHNKTxxZAEj7amQTsrU4EsNmUhaPgMs89yj9WLxCXVzW05cQjqYEA/hymzohWs1BdU3Y2KdmELe0v5fzRgDpNgYHhUN7IrlrlgXEWpuKRskBYc4PIvyACijY0/zkeEAyHOshYYGKhQbNm/NGWhFq83ro77CZZhX3Vl7hRnHLaEoCEE8atY8R1Txhy8aObhiS6R8ZVRTkZLar/FG/xe78RQfwHHD1al2w5oHR7xgTZylhbD+nVQ09Zmi25USpvqwumbMBE0OWhV+VH1WLCHfLQs6/5yuDjeZ/0D9tpQ8pfkiEkGLedzUzQkq+4/HmN4IFTOhgJHlu1tVUqohZIPZ5zSzqkqFzFQGRo2uAX8C2EJ3qgQMAEOpH8iOjiSKsezlIPuwvmrVDPxVfpY2Cq60oxRu6B8bZdbQkfwIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQAloxwiVu7pBhkO4fLqYRw4FC0VEx+c47W4xnrq3G/uXMGwE2Mfwple9FZnfT9JgSoT1UVw+cigo4720WdrPqkK8qnA3/PzGXlfJ3k6eFcBuli/KY1TakIJUAxFt5biNKatheMwAKsbF/JyVyaqG2dbSaXQ6hZBLQTYmLrmFWMvi9QdM1S8vNVMjn0hE2qQJtnVRuVwqRaAQ225jDv2CUCT28t0EWE6ccbiRi74l8KoW1Lo3v2EQ6ZZ89Xt3CwFSQHa6YVT685ECy82qMysU+YHBe9WmwJW05UAAY7JRsOo+RuuU/r4acNLmzprG+l7qsqqPkwXTcziw9Y2OYsFgY4bTlIOV0JC0AYApctDB3gbn83LM73CWccGrXq0liSV0wL11wscH3gFohXrwb646+6hgncZiDshlZlWaFSkHQJAxTR9bsbsCwKdZpzIIVOVTOT/3oLQKCCQvPriTpJiNa0P6gB0pq64lNcyG9fL8vS3YFFnWJTZwb8ZzGK+LZ91/2Y=", "gossipEndpoint": [{ "ipAddressV4": "iHFhtA==", "port": 30124 }, { "ipAddressV4": "CoAAEQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAJguXwyGFpb8MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTIwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTIwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDYXoYHBtw8adD5sxLZSnlG9XgBLWVbIDl3YA4rZZ11cgl6FG2TvF8UVNXQ177cRm1xUUJRI5ulSgDofnm7Iuf6c/GoQrud2nP1yMWewGslwiEi1h2pxbN7doFvn/92Y0lJVwSV/vOpbIyPRoMeF0jXd7TEI7dYj4S7gV9uWmQCIWjwTZqVsjIAtzEkYnmS0/m5XuD9MJsin8OQRu/PEFL8qaVPQJ2GhOhpUJqvADQ/Lsq/FHcPjylcRcnUQlFRojk2jqugtoRegByjPrAOSYGJeWUCVYmd7W51L/AkVx1rDLeHj0zLTTzQRF5G56i+S+tAcpY/uiCrwLvszFlDlD1diOuaucmu54lalrSTlVe5eOyq2ga2tKi11LQ+w09105zLyRWk7DBU93f5dTYNSmokI7b4sVRxu6SP0p/F9wND77wv2Ax5OpIWWty8zy8Y+xOuRyFu/rJ4ddDmRYvRmptM0rCAfv6hgd3m5Y/OAadQm/OuN91Uq9PIJdlMtjDbIfECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEANutmL3V1PlvlsZ6xG8Sx9cKTok3kf3rBf7D7eE8Nn8ryHi3cw9CvCaj1E6zmTTh9k23DAZVWulhjTY5GWcx5NO7QAWjKau44g/HecNNrWsD/+nIrhmAk2WxKp175CwqJaIWA7CM6VMfFktjaflUPcB6RJnHrAa8M1HUpEsBz0mFmLz7lIaDemxYCE8M8slb6wTMjpL83GB+ejudRe7YK2ZWixM+CGp0ARkV+EecHaCXgEoROUNwP6mZVJcgSVR1QBQwcGAMIrutsKENM8HR9o3LWacigoJXf+IX8c6aJhrHfFvm62q+hi3baj7iR6gebEdWPtmEXgoVWOk230fLGyPU1oBxaDdYa8V4+ZFv03O91By9tuFrwZOcLCb4CPRyr8A47lHNjRIeo2nUF/c+SjV0eBcPKCnn1nW/AQWCxJ0QzzG6tEeMAGdDrE2ujPlB+Y9Sn8vB0zjYQHTr1NKyyXNogB4y48jofLDLDGOQYI6uP2fDgZeiq4dV8w91WbPHV", "gossipEndpoint": [{ "ipAddressV4": "IkW49A==", "port": 30125 }, { "ipAddressV4": "CoAAGA==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJwswl59m488MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDAqlNMpfduuW0ETQVjdKf5ZBe3Ug/ybRMoCWIlue8UoxFzamAtoeFEW3GVi862iImRVyHbkBZzDQUw4ABwMdxfzTL9voozkMaOZb4KQ9yZ9zNLAAmSSuE6RFmSJnBtfufxFXqiu6esbcvyropjZLc65F2uoMCpKN0CHFpWEb2GZAaipp7WCOon0NllDLqkjPylluXO4mjbzzMSDPbBWRD8VjjkxZeszWSXYxz9hqcRYX01CGg+jhooCQ6j2yB8sfFAffIeTG6GSV1uCFa4san2emhQWpr+cHaVYJMtejL43HaEVQnF3vh5Z10T/7co63C63aay2hs6Bx5SschosyYiafI7GtbQ4qpOgjEDFT1jlydK21gy6MV3SFEYwcUfxvxxRj6pS7xiMFn4FYnBKPJWkaDkwTqboEshxstvASQOW993uEwzh4EjctRHSjSuTU6S9OsWi5I5cRF+xK6GaWsTp0KyO8uVpuM9kZfpOcor294quyKJ9nylNyIt/m8Q8/ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAqPLB/xr0Yv1l9w/RO+bqFtl8TkxF/6jOqoEUXY06dEInopLYpmkksZZ9G8vebt6hAoLjaxNMdRqCkzKgy4jn7/SQZNV9FMbZ7ckiDxsBxYZ2ZaBootuWzzVD6hCSO3Tg6JgkIzldtFtNcDVBRgZnHg+Rl6hn+gFV5S2OTTTPHWK7GHwgHXLhK7N0RL4YVrRCi/HTUZnuYCjBwvdDte5iqytY05cAO4p72P6YtDaOdAfL/IIKd1ylCWITDqTp/JDBz1uxjQmsXLVD/KEEtlvYlGjIr+wUUqIUPhFvB6ajl2NO0D/r+t1BH454zbodU92QnOJpXpoNuOv7jjALHCqo70mCSwTNUSZuVP6/KLmQe8sSzYs7O/c25FzHKBYy+aZujoa/X7aI6XVmsUkj6ae9MSvQurk0jMNg/Jy5EtWOMy7WEuyadrAv6KSP3oIfmL9jWoPcyOMfvjRHxGqOfZuFZatAwswY6O0E3ATTrN03t/BVqNHIYIXc6UOiUTo2Nx56", "gossipEndpoint": [{ "ipAddressV4": "IhBJhQ==", "port": 30126 }, { "ipAddressV4": "CoAALA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAOxH0o7YkAUoMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDf6+SJl+puqRNd5r2Tb802jQTqPm7k3NXIeU8NQ3Hy9p0G+9p4Hgnt3ftipar7lKPKnp4PFrOP7E7XSKpafxK2OVQ0jTMvc6Yjqt+9mzyNSI1I8cSHTmhJ7kMBt0+NwVM8QN+fbKcbQaoNiPwMcckVtGeMad4aZM6hRyxzI0H3wgMj4JiM9VRwx7JbEo3R7akRwLwGr9ZQm2EQwqiyReNkBnXrsyP4KPPVAoeMfGchoAuBbV+r6v1OeYddocYmZkrsvMXUKF/uEcgd8gTu+pv3jObwIEVqXo1yC6ZlCFqO7LIvT8jTAAljkszoo67ykXTbKS0PZeLDg6nvdPvBMQ50yjfswR88S6N8VU6pud7Y+VbMYUiGzlrFi4MB9dikAjEj4PEetQyZdn84ZXGxerXlU/vTO2Fp4i1ec5rmX1P0WYMlbNELE408j5nfCfzD/qdcF5HZAiUVTYU/SWpzWcn34++KGpuqZZQdsGwCLQWeMeA/OEemYChis4cO94aOzrECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAlj5YIsbYXk2JGP9kRCBLDgz27ymYi1KDbO8g18V4T0zj2Zl7858U7mF9UBSSW+Cjl1UtUdvqFWZhh8jRoO3Jov1QGTULHRfyyPElD4VpwFribiu4GYJaodYy6NE50WwSJf32gLG0jHQWt7q+cOrn6WaG2h8O1sIxbTlnu1kqKQUQtu4oX8u23b5m9QXVJfJVdecwD5Rmab2d3dq/NNv2iNELH0myqtcoqw26xwIvXwaS4Gqi+Y0cOfjWL5Gv5AHIwvBXGIh3KUU7pbyBzqjkigbzSeoZw0C8G2cRTl0+QTuet2SVYlFh5J9/FBLvIfMfIpguglaU6xTVoRpo7RF24qQKFt2IlBROpqcwl0FyfE+2c19FGt1V8E5dYqE4T2mHT6FSOI3DckA2afBm1OCeMNtkqCQT8x+JvdKrgUh44QDm4PIVZDzaxog/zOzRWPCgpCPq0HcNMzgCVFt+4q8eTL9Ju/rQcS9bDosjMA69NGLIOCdPW2i/gkS9x9rTXgyp", "gossipEndpoint": [{ "ipAddressV4": "IoiZJQ==", "port": 30127 }, { "ipAddressV4": "CoAAKQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIXlngkVEv6iMwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAL4o3FK8th1cG+FSlw4iT9FlkwK+hOj4Ay6Z70mZlsNwszgxvddUEO4BEdA1iSWfxkYOLl4QwwPr3l394a07VfB5OK3dqJ6CjVdByyvzghtk3gOpkskWlJxp6vah7BbIJFWE8off7fhCdwAGSrwIRdGE8u8GbKJIdHk6/XyjB3j0BXTIgeaPTJxLeuz/2l/dQVRMXyZNxlc5UVQYnX9haMRk7M5bkb9uwfYPRikEJFp6G72x7M7Q9lBGJ3ArCQn/lPJfHSg01GxfDhWH8DOwLaFdv1bCs2zHTn7R7Wq9ymXvkUsZhlYO4mLR8HKDcM3sCrJa2rg8vgnIoZupHABKxkgtT2wxV7fM5f2oiz0mDYDTRJpgmK1lmNANj2tKnGqeDnsW7Q3zwufgZZhbks8+8uigyOyKNbp6D7Vv5KeYRibjr/xh+yWT0v02dtpBIdhqDa5CUVD9fCwigZj3PQc8N4e47ZL6s1pXpQ6Cf0lB0fSsvyhnGRa8HMx2q5eg5j/lCQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQCr9yUzOoi0xhoDE1mqR3FR/iVCq9PaBUURWL743LDMrlEvpzKX0upcwwwdgJFjVqVUywh6rKeHQt4O4UV6FIbpp0PSjSE7XZSK3UNqnhZJhQ3aNrOP+6wBhm2B0ZjrxyMS1EWeD9tcNkdYluO00RlieAEV4zwoAfeFPSB21iXW5dU8idhNuTLptDc7SJoErxN+44jvcrSe/ZhpQohG6WfyDPH0BE1tyzsiD29PAWKkrfhg5kzjTAP/qFp+ByazeltP9/F0NXI5AHbE0pKYr56XUlwDfDZOTU9b1YeS7kKyPvccvC2j9NjGGM7NjafdFLHUTYBZiNUTZXVstddYtTCVbTqI7I/x6hoeeNVDZv7XluwZLrYsDNsNrWU3c9VijPK1CE5Owy+gJoGgxEHfA/n9Jvc3lEesqKBpW92RazkpHW2eD9wh8Ayv3q6PNDGzWyiXA8YWW6yD/dIp2Oh8szZUfOXy8sQ8VW86T6RsqGP5CKKPGW1NnP/KTKe5/WoBLZQ=", "gossipEndpoint": [{ "ipAddressV4": "Ink6nQ==", "port": 30128 }, { "ipAddressV4": "CoAAJQ==", "port": 30128 }] }] }
node4 6.703s 2025-11-20 18:09:55.657 36 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 6.704s 2025-11-20 18:09:55.658 37 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node2 6.706s 2025-11-20 18:09:55.660 41 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node2 6.712s 2025-11-20 18:09:55.666 42 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node2 6.717s 2025-11-20 18:09:55.671 43 INFO STARTUP <<start-node-2>> ConsistencyTestingToolMain: init called in Main for node 2.
node2 6.718s 2025-11-20 18:09:55.672 44 INFO STARTUP <<start-node-2>> SwirldsPlatform: Starting platform 2
node2 6.719s 2025-11-20 18:09:55.673 45 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node4 6.721s 2025-11-20 18:09:55.675 38 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: e55cb0feb26a5e52b793a5db159084e9a1ab2ad3b3045a9e762a1d28d641344e0d63488bafd027ea0dc5f2e339f0f936
(root) VirtualMap state / fee-theory-ready-ski {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":2,"lastLeafPath":4},"Singletons":{"RosterService.ROSTER_STATE":{"path":2,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":3,"mnemonic":"normal-stage-book-frozen"}}}
node2 6.723s 2025-11-20 18:09:55.677 46 INFO STARTUP <<start-node-2>> CycleFinder: No cyclical back pressure detected in wiring model.
node4 6.724s 2025-11-20 18:09:55.678 40 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: Starting the ReconnectController
node2 6.725s 2025-11-20 18:09:55.679 47 INFO STARTUP <<start-node-2>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node2 6.725s 2025-11-20 18:09:55.679 48 INFO STARTUP <<start-node-2>> InputWireChecks: All input wires have been bound.
node2 6.727s 2025-11-20 18:09:55.681 49 WARN STARTUP <<start-node-2>> PcesFileTracker: No preconsensus event files available
node2 6.728s 2025-11-20 18:09:55.682 50 INFO STARTUP <<start-node-2>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node2 6.729s 2025-11-20 18:09:55.683 51 INFO STARTUP <<start-node-2>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node2 6.730s 2025-11-20 18:09:55.684 52 INFO STARTUP <<app: appMain 2>> ConsistencyTestingToolMain: run called in Main.
node2 6.733s 2025-11-20 18:09:55.687 53 INFO PLATFORM_STATUS <platformForkJoinThread-5> StatusStateMachine: Platform spent 178.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node2 6.738s 2025-11-20 18:09:55.692 54 INFO PLATFORM_STATUS <platformForkJoinThread-5> StatusStateMachine: Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node4 6.942s 2025-11-20 18:09:55.896 41 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node4 6.947s 2025-11-20 18:09:55.901 42 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node4 6.951s 2025-11-20 18:09:55.905 43 INFO STARTUP <<start-node-4>> ConsistencyTestingToolMain: init called in Main for node 4.
node4 6.951s 2025-11-20 18:09:55.905 44 INFO STARTUP <<start-node-4>> SwirldsPlatform: Starting platform 4
node4 6.953s 2025-11-20 18:09:55.907 45 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node4 6.956s 2025-11-20 18:09:55.910 46 INFO STARTUP <<start-node-4>> CycleFinder: No cyclical back pressure detected in wiring model.
node4 6.957s 2025-11-20 18:09:55.911 47 INFO STARTUP <<start-node-4>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node4 6.958s 2025-11-20 18:09:55.912 48 INFO STARTUP <<start-node-4>> InputWireChecks: All input wires have been bound.
node4 6.959s 2025-11-20 18:09:55.913 49 WARN STARTUP <<start-node-4>> PcesFileTracker: No preconsensus event files available
node4 6.960s 2025-11-20 18:09:55.914 50 INFO STARTUP <<start-node-4>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node4 6.961s 2025-11-20 18:09:55.915 51 INFO STARTUP <<start-node-4>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node4 6.962s 2025-11-20 18:09:55.916 52 INFO STARTUP <<app: appMain 4>> ConsistencyTestingToolMain: run called in Main.
node4 6.964s 2025-11-20 18:09:55.918 53 INFO PLATFORM_STATUS <platformForkJoinThread-3> StatusStateMachine: Platform spent 184.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node4 6.969s 2025-11-20 18:09:55.923 54 INFO PLATFORM_STATUS <platformForkJoinThread-3> StatusStateMachine: Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node3 7.174s 2025-11-20 18:09:56.128 31 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26287840]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=257209, randomLong=-5506794415744951807, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=13520, randomLong=7296903147260183285, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1455609, data=35, exception=null]
OS Health Check Report - Complete (took 1027 ms)
node3 7.206s 2025-11-20 18:09:56.160 32 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node3 7.215s 2025-11-20 18:09:56.169 33 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node3 7.217s 2025-11-20 18:09:56.171 34 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node3 7.312s 2025-11-20 18:09:56.266 35 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIdUmpLKzyXgUwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBALXCoDQ+HOVsEDTZpFuJITSaGwaKX2is5K1P/lV+G+ll6u36IdqKNnZIirJrpX2N0Ad6NeF/oFcMhietrKt818PDA9Tbb2tqcHNKTxxZAEj7amQTsrU4EsNmUhaPgMs89yj9WLxCXVzW05cQjqYEA/hymzohWs1BdU3Y2KdmELe0v5fzRgDpNgYHhUN7IrlrlgXEWpuKRskBYc4PIvyACijY0/zkeEAyHOshYYGKhQbNm/NGWhFq83ro77CZZhX3Vl7hRnHLaEoCEE8atY8R1Txhy8aObhiS6R8ZVRTkZLar/FG/xe78RQfwHHD1al2w5oHR7xgTZylhbD+nVQ09Zmi25USpvqwumbMBE0OWhV+VH1WLCHfLQs6/5yuDjeZ/0D9tpQ8pfkiEkGLedzUzQkq+4/HmN4IFTOhgJHlu1tVUqohZIPZ5zSzqkqFzFQGRo2uAX8C2EJ3qgQMAEOpH8iOjiSKsezlIPuwvmrVDPxVfpY2Cq60oxRu6B8bZdbQkfwIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQAloxwiVu7pBhkO4fLqYRw4FC0VEx+c47W4xnrq3G/uXMGwE2Mfwple9FZnfT9JgSoT1UVw+cigo4720WdrPqkK8qnA3/PzGXlfJ3k6eFcBuli/KY1TakIJUAxFt5biNKatheMwAKsbF/JyVyaqG2dbSaXQ6hZBLQTYmLrmFWMvi9QdM1S8vNVMjn0hE2qQJtnVRuVwqRaAQ225jDv2CUCT28t0EWE6ccbiRi74l8KoW1Lo3v2EQ6ZZ89Xt3CwFSQHa6YVT685ECy82qMysU+YHBe9WmwJW05UAAY7JRsOo+RuuU/r4acNLmzprG+l7qsqqPkwXTcziw9Y2OYsFgY4bTlIOV0JC0AYApctDB3gbn83LM73CWccGrXq0liSV0wL11wscH3gFohXrwb646+6hgncZiDshlZlWaFSkHQJAxTR9bsbsCwKdZpzIIVOVTOT/3oLQKCCQvPriTpJiNa0P6gB0pq64lNcyG9fL8vS3YFFnWJTZwb8ZzGK+LZ91/2Y=", "gossipEndpoint": [{ "ipAddressV4": "iHFhtA==", "port": 30124 }, { "ipAddressV4": "CoAAEQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAJguXwyGFpb8MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTIwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTIwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDYXoYHBtw8adD5sxLZSnlG9XgBLWVbIDl3YA4rZZ11cgl6FG2TvF8UVNXQ177cRm1xUUJRI5ulSgDofnm7Iuf6c/GoQrud2nP1yMWewGslwiEi1h2pxbN7doFvn/92Y0lJVwSV/vOpbIyPRoMeF0jXd7TEI7dYj4S7gV9uWmQCIWjwTZqVsjIAtzEkYnmS0/m5XuD9MJsin8OQRu/PEFL8qaVPQJ2GhOhpUJqvADQ/Lsq/FHcPjylcRcnUQlFRojk2jqugtoRegByjPrAOSYGJeWUCVYmd7W51L/AkVx1rDLeHj0zLTTzQRF5G56i+S+tAcpY/uiCrwLvszFlDlD1diOuaucmu54lalrSTlVe5eOyq2ga2tKi11LQ+w09105zLyRWk7DBU93f5dTYNSmokI7b4sVRxu6SP0p/F9wND77wv2Ax5OpIWWty8zy8Y+xOuRyFu/rJ4ddDmRYvRmptM0rCAfv6hgd3m5Y/OAadQm/OuN91Uq9PIJdlMtjDbIfECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEANutmL3V1PlvlsZ6xG8Sx9cKTok3kf3rBf7D7eE8Nn8ryHi3cw9CvCaj1E6zmTTh9k23DAZVWulhjTY5GWcx5NO7QAWjKau44g/HecNNrWsD/+nIrhmAk2WxKp175CwqJaIWA7CM6VMfFktjaflUPcB6RJnHrAa8M1HUpEsBz0mFmLz7lIaDemxYCE8M8slb6wTMjpL83GB+ejudRe7YK2ZWixM+CGp0ARkV+EecHaCXgEoROUNwP6mZVJcgSVR1QBQwcGAMIrutsKENM8HR9o3LWacigoJXf+IX8c6aJhrHfFvm62q+hi3baj7iR6gebEdWPtmEXgoVWOk230fLGyPU1oBxaDdYa8V4+ZFv03O91By9tuFrwZOcLCb4CPRyr8A47lHNjRIeo2nUF/c+SjV0eBcPKCnn1nW/AQWCxJ0QzzG6tEeMAGdDrE2ujPlB+Y9Sn8vB0zjYQHTr1NKyyXNogB4y48jofLDLDGOQYI6uP2fDgZeiq4dV8w91WbPHV", "gossipEndpoint": [{ "ipAddressV4": "IkW49A==", "port": 30125 }, { "ipAddressV4": "CoAAGA==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJwswl59m488MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDAqlNMpfduuW0ETQVjdKf5ZBe3Ug/ybRMoCWIlue8UoxFzamAtoeFEW3GVi862iImRVyHbkBZzDQUw4ABwMdxfzTL9voozkMaOZb4KQ9yZ9zNLAAmSSuE6RFmSJnBtfufxFXqiu6esbcvyropjZLc65F2uoMCpKN0CHFpWEb2GZAaipp7WCOon0NllDLqkjPylluXO4mjbzzMSDPbBWRD8VjjkxZeszWSXYxz9hqcRYX01CGg+jhooCQ6j2yB8sfFAffIeTG6GSV1uCFa4san2emhQWpr+cHaVYJMtejL43HaEVQnF3vh5Z10T/7co63C63aay2hs6Bx5SschosyYiafI7GtbQ4qpOgjEDFT1jlydK21gy6MV3SFEYwcUfxvxxRj6pS7xiMFn4FYnBKPJWkaDkwTqboEshxstvASQOW993uEwzh4EjctRHSjSuTU6S9OsWi5I5cRF+xK6GaWsTp0KyO8uVpuM9kZfpOcor294quyKJ9nylNyIt/m8Q8/ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAqPLB/xr0Yv1l9w/RO+bqFtl8TkxF/6jOqoEUXY06dEInopLYpmkksZZ9G8vebt6hAoLjaxNMdRqCkzKgy4jn7/SQZNV9FMbZ7ckiDxsBxYZ2ZaBootuWzzVD6hCSO3Tg6JgkIzldtFtNcDVBRgZnHg+Rl6hn+gFV5S2OTTTPHWK7GHwgHXLhK7N0RL4YVrRCi/HTUZnuYCjBwvdDte5iqytY05cAO4p72P6YtDaOdAfL/IIKd1ylCWITDqTp/JDBz1uxjQmsXLVD/KEEtlvYlGjIr+wUUqIUPhFvB6ajl2NO0D/r+t1BH454zbodU92QnOJpXpoNuOv7jjALHCqo70mCSwTNUSZuVP6/KLmQe8sSzYs7O/c25FzHKBYy+aZujoa/X7aI6XVmsUkj6ae9MSvQurk0jMNg/Jy5EtWOMy7WEuyadrAv6KSP3oIfmL9jWoPcyOMfvjRHxGqOfZuFZatAwswY6O0E3ATTrN03t/BVqNHIYIXc6UOiUTo2Nx56", "gossipEndpoint": [{ "ipAddressV4": "IhBJhQ==", "port": 30126 }, { "ipAddressV4": "CoAALA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAOxH0o7YkAUoMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDf6+SJl+puqRNd5r2Tb802jQTqPm7k3NXIeU8NQ3Hy9p0G+9p4Hgnt3ftipar7lKPKnp4PFrOP7E7XSKpafxK2OVQ0jTMvc6Yjqt+9mzyNSI1I8cSHTmhJ7kMBt0+NwVM8QN+fbKcbQaoNiPwMcckVtGeMad4aZM6hRyxzI0H3wgMj4JiM9VRwx7JbEo3R7akRwLwGr9ZQm2EQwqiyReNkBnXrsyP4KPPVAoeMfGchoAuBbV+r6v1OeYddocYmZkrsvMXUKF/uEcgd8gTu+pv3jObwIEVqXo1yC6ZlCFqO7LIvT8jTAAljkszoo67ykXTbKS0PZeLDg6nvdPvBMQ50yjfswR88S6N8VU6pud7Y+VbMYUiGzlrFi4MB9dikAjEj4PEetQyZdn84ZXGxerXlU/vTO2Fp4i1ec5rmX1P0WYMlbNELE408j5nfCfzD/qdcF5HZAiUVTYU/SWpzWcn34++KGpuqZZQdsGwCLQWeMeA/OEemYChis4cO94aOzrECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAlj5YIsbYXk2JGP9kRCBLDgz27ymYi1KDbO8g18V4T0zj2Zl7858U7mF9UBSSW+Cjl1UtUdvqFWZhh8jRoO3Jov1QGTULHRfyyPElD4VpwFribiu4GYJaodYy6NE50WwSJf32gLG0jHQWt7q+cOrn6WaG2h8O1sIxbTlnu1kqKQUQtu4oX8u23b5m9QXVJfJVdecwD5Rmab2d3dq/NNv2iNELH0myqtcoqw26xwIvXwaS4Gqi+Y0cOfjWL5Gv5AHIwvBXGIh3KUU7pbyBzqjkigbzSeoZw0C8G2cRTl0+QTuet2SVYlFh5J9/FBLvIfMfIpguglaU6xTVoRpo7RF24qQKFt2IlBROpqcwl0FyfE+2c19FGt1V8E5dYqE4T2mHT6FSOI3DckA2afBm1OCeMNtkqCQT8x+JvdKrgUh44QDm4PIVZDzaxog/zOzRWPCgpCPq0HcNMzgCVFt+4q8eTL9Ju/rQcS9bDosjMA69NGLIOCdPW2i/gkS9x9rTXgyp", "gossipEndpoint": [{ "ipAddressV4": "IoiZJQ==", "port": 30127 }, { "ipAddressV4": "CoAAKQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIXlngkVEv6iMwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAL4o3FK8th1cG+FSlw4iT9FlkwK+hOj4Ay6Z70mZlsNwszgxvddUEO4BEdA1iSWfxkYOLl4QwwPr3l394a07VfB5OK3dqJ6CjVdByyvzghtk3gOpkskWlJxp6vah7BbIJFWE8off7fhCdwAGSrwIRdGE8u8GbKJIdHk6/XyjB3j0BXTIgeaPTJxLeuz/2l/dQVRMXyZNxlc5UVQYnX9haMRk7M5bkb9uwfYPRikEJFp6G72x7M7Q9lBGJ3ArCQn/lPJfHSg01GxfDhWH8DOwLaFdv1bCs2zHTn7R7Wq9ymXvkUsZhlYO4mLR8HKDcM3sCrJa2rg8vgnIoZupHABKxkgtT2wxV7fM5f2oiz0mDYDTRJpgmK1lmNANj2tKnGqeDnsW7Q3zwufgZZhbks8+8uigyOyKNbp6D7Vv5KeYRibjr/xh+yWT0v02dtpBIdhqDa5CUVD9fCwigZj3PQc8N4e47ZL6s1pXpQ6Cf0lB0fSsvyhnGRa8HMx2q5eg5j/lCQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQCr9yUzOoi0xhoDE1mqR3FR/iVCq9PaBUURWL743LDMrlEvpzKX0upcwwwdgJFjVqVUywh6rKeHQt4O4UV6FIbpp0PSjSE7XZSK3UNqnhZJhQ3aNrOP+6wBhm2B0ZjrxyMS1EWeD9tcNkdYluO00RlieAEV4zwoAfeFPSB21iXW5dU8idhNuTLptDc7SJoErxN+44jvcrSe/ZhpQohG6WfyDPH0BE1tyzsiD29PAWKkrfhg5kzjTAP/qFp+ByazeltP9/F0NXI5AHbE0pKYr56XUlwDfDZOTU9b1YeS7kKyPvccvC2j9NjGGM7NjafdFLHUTYBZiNUTZXVstddYtTCVbTqI7I/x6hoeeNVDZv7XluwZLrYsDNsNrWU3c9VijPK1CE5Owy+gJoGgxEHfA/n9Jvc3lEesqKBpW92RazkpHW2eD9wh8Ayv3q6PNDGzWyiXA8YWW6yD/dIp2Oh8szZUfOXy8sQ8VW86T6RsqGP5CKKPGW1NnP/KTKe5/WoBLZQ=", "gossipEndpoint": [{ "ipAddressV4": "Ink6nQ==", "port": 30128 }, { "ipAddressV4": "CoAAJQ==", "port": 30128 }] }] }
node3 7.338s 2025-11-20 18:09:56.292 36 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/3/ConsistencyTestLog.csv
node3 7.339s 2025-11-20 18:09:56.293 37 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node3 7.357s 2025-11-20 18:09:56.311 38 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: e55cb0feb26a5e52b793a5db159084e9a1ab2ad3b3045a9e762a1d28d641344e0d63488bafd027ea0dc5f2e339f0f936
(root) VirtualMap state / fee-theory-ready-ski {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":2,"lastLeafPath":4},"Singletons":{"RosterService.ROSTER_STATE":{"path":2,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":3,"mnemonic":"normal-stage-book-frozen"}}}
node3 7.360s 2025-11-20 18:09:56.314 40 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: Starting the ReconnectController
node3 7.581s 2025-11-20 18:09:56.535 41 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node3 7.586s 2025-11-20 18:09:56.540 42 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node3 7.592s 2025-11-20 18:09:56.546 43 INFO STARTUP <<start-node-3>> ConsistencyTestingToolMain: init called in Main for node 3.
node3 7.593s 2025-11-20 18:09:56.547 44 INFO STARTUP <<start-node-3>> SwirldsPlatform: Starting platform 3
node3 7.594s 2025-11-20 18:09:56.548 45 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node3 7.598s 2025-11-20 18:09:56.552 46 INFO STARTUP <<start-node-3>> CycleFinder: No cyclical back pressure detected in wiring model.
node3 7.599s 2025-11-20 18:09:56.553 47 INFO STARTUP <<start-node-3>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node3 7.600s 2025-11-20 18:09:56.554 48 INFO STARTUP <<start-node-3>> InputWireChecks: All input wires have been bound.
node3 7.602s 2025-11-20 18:09:56.556 49 WARN STARTUP <<start-node-3>> PcesFileTracker: No preconsensus event files available
node3 7.603s 2025-11-20 18:09:56.557 50 INFO STARTUP <<start-node-3>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node3 7.605s 2025-11-20 18:09:56.559 51 INFO STARTUP <<start-node-3>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node3 7.606s 2025-11-20 18:09:56.560 52 INFO STARTUP <<app: appMain 3>> ConsistencyTestingToolMain: run called in Main.
node3 7.608s 2025-11-20 18:09:56.562 53 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 188.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node3 7.616s 2025-11-20 18:09:56.570 54 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 5.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node0 7.719s 2025-11-20 18:09:56.673 31 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26168384]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=288710, randomLong=-461044848898853914, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=30450, randomLong=937332634199733900, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1876800, data=35, exception=null]
OS Health Check Report - Complete (took 1031 ms)
node0 7.755s 2025-11-20 18:09:56.709 32 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node0 7.765s 2025-11-20 18:09:56.719 33 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node0 7.767s 2025-11-20 18:09:56.721 34 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node0 7.869s 2025-11-20 18:09:56.823 35 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIdUmpLKzyXgUwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBALXCoDQ+HOVsEDTZpFuJITSaGwaKX2is5K1P/lV+G+ll6u36IdqKNnZIirJrpX2N0Ad6NeF/oFcMhietrKt818PDA9Tbb2tqcHNKTxxZAEj7amQTsrU4EsNmUhaPgMs89yj9WLxCXVzW05cQjqYEA/hymzohWs1BdU3Y2KdmELe0v5fzRgDpNgYHhUN7IrlrlgXEWpuKRskBYc4PIvyACijY0/zkeEAyHOshYYGKhQbNm/NGWhFq83ro77CZZhX3Vl7hRnHLaEoCEE8atY8R1Txhy8aObhiS6R8ZVRTkZLar/FG/xe78RQfwHHD1al2w5oHR7xgTZylhbD+nVQ09Zmi25USpvqwumbMBE0OWhV+VH1WLCHfLQs6/5yuDjeZ/0D9tpQ8pfkiEkGLedzUzQkq+4/HmN4IFTOhgJHlu1tVUqohZIPZ5zSzqkqFzFQGRo2uAX8C2EJ3qgQMAEOpH8iOjiSKsezlIPuwvmrVDPxVfpY2Cq60oxRu6B8bZdbQkfwIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQAloxwiVu7pBhkO4fLqYRw4FC0VEx+c47W4xnrq3G/uXMGwE2Mfwple9FZnfT9JgSoT1UVw+cigo4720WdrPqkK8qnA3/PzGXlfJ3k6eFcBuli/KY1TakIJUAxFt5biNKatheMwAKsbF/JyVyaqG2dbSaXQ6hZBLQTYmLrmFWMvi9QdM1S8vNVMjn0hE2qQJtnVRuVwqRaAQ225jDv2CUCT28t0EWE6ccbiRi74l8KoW1Lo3v2EQ6ZZ89Xt3CwFSQHa6YVT685ECy82qMysU+YHBe9WmwJW05UAAY7JRsOo+RuuU/r4acNLmzprG+l7qsqqPkwXTcziw9Y2OYsFgY4bTlIOV0JC0AYApctDB3gbn83LM73CWccGrXq0liSV0wL11wscH3gFohXrwb646+6hgncZiDshlZlWaFSkHQJAxTR9bsbsCwKdZpzIIVOVTOT/3oLQKCCQvPriTpJiNa0P6gB0pq64lNcyG9fL8vS3YFFnWJTZwb8ZzGK+LZ91/2Y=", "gossipEndpoint": [{ "ipAddressV4": "iHFhtA==", "port": 30124 }, { "ipAddressV4": "CoAAEQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAJguXwyGFpb8MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTIwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTIwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDYXoYHBtw8adD5sxLZSnlG9XgBLWVbIDl3YA4rZZ11cgl6FG2TvF8UVNXQ177cRm1xUUJRI5ulSgDofnm7Iuf6c/GoQrud2nP1yMWewGslwiEi1h2pxbN7doFvn/92Y0lJVwSV/vOpbIyPRoMeF0jXd7TEI7dYj4S7gV9uWmQCIWjwTZqVsjIAtzEkYnmS0/m5XuD9MJsin8OQRu/PEFL8qaVPQJ2GhOhpUJqvADQ/Lsq/FHcPjylcRcnUQlFRojk2jqugtoRegByjPrAOSYGJeWUCVYmd7W51L/AkVx1rDLeHj0zLTTzQRF5G56i+S+tAcpY/uiCrwLvszFlDlD1diOuaucmu54lalrSTlVe5eOyq2ga2tKi11LQ+w09105zLyRWk7DBU93f5dTYNSmokI7b4sVRxu6SP0p/F9wND77wv2Ax5OpIWWty8zy8Y+xOuRyFu/rJ4ddDmRYvRmptM0rCAfv6hgd3m5Y/OAadQm/OuN91Uq9PIJdlMtjDbIfECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEANutmL3V1PlvlsZ6xG8Sx9cKTok3kf3rBf7D7eE8Nn8ryHi3cw9CvCaj1E6zmTTh9k23DAZVWulhjTY5GWcx5NO7QAWjKau44g/HecNNrWsD/+nIrhmAk2WxKp175CwqJaIWA7CM6VMfFktjaflUPcB6RJnHrAa8M1HUpEsBz0mFmLz7lIaDemxYCE8M8slb6wTMjpL83GB+ejudRe7YK2ZWixM+CGp0ARkV+EecHaCXgEoROUNwP6mZVJcgSVR1QBQwcGAMIrutsKENM8HR9o3LWacigoJXf+IX8c6aJhrHfFvm62q+hi3baj7iR6gebEdWPtmEXgoVWOk230fLGyPU1oBxaDdYa8V4+ZFv03O91By9tuFrwZOcLCb4CPRyr8A47lHNjRIeo2nUF/c+SjV0eBcPKCnn1nW/AQWCxJ0QzzG6tEeMAGdDrE2ujPlB+Y9Sn8vB0zjYQHTr1NKyyXNogB4y48jofLDLDGOQYI6uP2fDgZeiq4dV8w91WbPHV", "gossipEndpoint": [{ "ipAddressV4": "IkW49A==", "port": 30125 }, { "ipAddressV4": "CoAAGA==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJwswl59m488MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDAqlNMpfduuW0ETQVjdKf5ZBe3Ug/ybRMoCWIlue8UoxFzamAtoeFEW3GVi862iImRVyHbkBZzDQUw4ABwMdxfzTL9voozkMaOZb4KQ9yZ9zNLAAmSSuE6RFmSJnBtfufxFXqiu6esbcvyropjZLc65F2uoMCpKN0CHFpWEb2GZAaipp7WCOon0NllDLqkjPylluXO4mjbzzMSDPbBWRD8VjjkxZeszWSXYxz9hqcRYX01CGg+jhooCQ6j2yB8sfFAffIeTG6GSV1uCFa4san2emhQWpr+cHaVYJMtejL43HaEVQnF3vh5Z10T/7co63C63aay2hs6Bx5SschosyYiafI7GtbQ4qpOgjEDFT1jlydK21gy6MV3SFEYwcUfxvxxRj6pS7xiMFn4FYnBKPJWkaDkwTqboEshxstvASQOW993uEwzh4EjctRHSjSuTU6S9OsWi5I5cRF+xK6GaWsTp0KyO8uVpuM9kZfpOcor294quyKJ9nylNyIt/m8Q8/ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAqPLB/xr0Yv1l9w/RO+bqFtl8TkxF/6jOqoEUXY06dEInopLYpmkksZZ9G8vebt6hAoLjaxNMdRqCkzKgy4jn7/SQZNV9FMbZ7ckiDxsBxYZ2ZaBootuWzzVD6hCSO3Tg6JgkIzldtFtNcDVBRgZnHg+Rl6hn+gFV5S2OTTTPHWK7GHwgHXLhK7N0RL4YVrRCi/HTUZnuYCjBwvdDte5iqytY05cAO4p72P6YtDaOdAfL/IIKd1ylCWITDqTp/JDBz1uxjQmsXLVD/KEEtlvYlGjIr+wUUqIUPhFvB6ajl2NO0D/r+t1BH454zbodU92QnOJpXpoNuOv7jjALHCqo70mCSwTNUSZuVP6/KLmQe8sSzYs7O/c25FzHKBYy+aZujoa/X7aI6XVmsUkj6ae9MSvQurk0jMNg/Jy5EtWOMy7WEuyadrAv6KSP3oIfmL9jWoPcyOMfvjRHxGqOfZuFZatAwswY6O0E3ATTrN03t/BVqNHIYIXc6UOiUTo2Nx56", "gossipEndpoint": [{ "ipAddressV4": "IhBJhQ==", "port": 30126 }, { "ipAddressV4": "CoAALA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAOxH0o7YkAUoMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDf6+SJl+puqRNd5r2Tb802jQTqPm7k3NXIeU8NQ3Hy9p0G+9p4Hgnt3ftipar7lKPKnp4PFrOP7E7XSKpafxK2OVQ0jTMvc6Yjqt+9mzyNSI1I8cSHTmhJ7kMBt0+NwVM8QN+fbKcbQaoNiPwMcckVtGeMad4aZM6hRyxzI0H3wgMj4JiM9VRwx7JbEo3R7akRwLwGr9ZQm2EQwqiyReNkBnXrsyP4KPPVAoeMfGchoAuBbV+r6v1OeYddocYmZkrsvMXUKF/uEcgd8gTu+pv3jObwIEVqXo1yC6ZlCFqO7LIvT8jTAAljkszoo67ykXTbKS0PZeLDg6nvdPvBMQ50yjfswR88S6N8VU6pud7Y+VbMYUiGzlrFi4MB9dikAjEj4PEetQyZdn84ZXGxerXlU/vTO2Fp4i1ec5rmX1P0WYMlbNELE408j5nfCfzD/qdcF5HZAiUVTYU/SWpzWcn34++KGpuqZZQdsGwCLQWeMeA/OEemYChis4cO94aOzrECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAlj5YIsbYXk2JGP9kRCBLDgz27ymYi1KDbO8g18V4T0zj2Zl7858U7mF9UBSSW+Cjl1UtUdvqFWZhh8jRoO3Jov1QGTULHRfyyPElD4VpwFribiu4GYJaodYy6NE50WwSJf32gLG0jHQWt7q+cOrn6WaG2h8O1sIxbTlnu1kqKQUQtu4oX8u23b5m9QXVJfJVdecwD5Rmab2d3dq/NNv2iNELH0myqtcoqw26xwIvXwaS4Gqi+Y0cOfjWL5Gv5AHIwvBXGIh3KUU7pbyBzqjkigbzSeoZw0C8G2cRTl0+QTuet2SVYlFh5J9/FBLvIfMfIpguglaU6xTVoRpo7RF24qQKFt2IlBROpqcwl0FyfE+2c19FGt1V8E5dYqE4T2mHT6FSOI3DckA2afBm1OCeMNtkqCQT8x+JvdKrgUh44QDm4PIVZDzaxog/zOzRWPCgpCPq0HcNMzgCVFt+4q8eTL9Ju/rQcS9bDosjMA69NGLIOCdPW2i/gkS9x9rTXgyp", "gossipEndpoint": [{ "ipAddressV4": "IoiZJQ==", "port": 30127 }, { "ipAddressV4": "CoAAKQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIXlngkVEv6iMwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAL4o3FK8th1cG+FSlw4iT9FlkwK+hOj4Ay6Z70mZlsNwszgxvddUEO4BEdA1iSWfxkYOLl4QwwPr3l394a07VfB5OK3dqJ6CjVdByyvzghtk3gOpkskWlJxp6vah7BbIJFWE8off7fhCdwAGSrwIRdGE8u8GbKJIdHk6/XyjB3j0BXTIgeaPTJxLeuz/2l/dQVRMXyZNxlc5UVQYnX9haMRk7M5bkb9uwfYPRikEJFp6G72x7M7Q9lBGJ3ArCQn/lPJfHSg01GxfDhWH8DOwLaFdv1bCs2zHTn7R7Wq9ymXvkUsZhlYO4mLR8HKDcM3sCrJa2rg8vgnIoZupHABKxkgtT2wxV7fM5f2oiz0mDYDTRJpgmK1lmNANj2tKnGqeDnsW7Q3zwufgZZhbks8+8uigyOyKNbp6D7Vv5KeYRibjr/xh+yWT0v02dtpBIdhqDa5CUVD9fCwigZj3PQc8N4e47ZL6s1pXpQ6Cf0lB0fSsvyhnGRa8HMx2q5eg5j/lCQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQCr9yUzOoi0xhoDE1mqR3FR/iVCq9PaBUURWL743LDMrlEvpzKX0upcwwwdgJFjVqVUywh6rKeHQt4O4UV6FIbpp0PSjSE7XZSK3UNqnhZJhQ3aNrOP+6wBhm2B0ZjrxyMS1EWeD9tcNkdYluO00RlieAEV4zwoAfeFPSB21iXW5dU8idhNuTLptDc7SJoErxN+44jvcrSe/ZhpQohG6WfyDPH0BE1tyzsiD29PAWKkrfhg5kzjTAP/qFp+ByazeltP9/F0NXI5AHbE0pKYr56XUlwDfDZOTU9b1YeS7kKyPvccvC2j9NjGGM7NjafdFLHUTYBZiNUTZXVstddYtTCVbTqI7I/x6hoeeNVDZv7XluwZLrYsDNsNrWU3c9VijPK1CE5Owy+gJoGgxEHfA/n9Jvc3lEesqKBpW92RazkpHW2eD9wh8Ayv3q6PNDGzWyiXA8YWW6yD/dIp2Oh8szZUfOXy8sQ8VW86T6RsqGP5CKKPGW1NnP/KTKe5/WoBLZQ=", "gossipEndpoint": [{ "ipAddressV4": "Ink6nQ==", "port": 30128 }, { "ipAddressV4": "CoAAJQ==", "port": 30128 }] }] }
node1 7.876s 2025-11-20 18:09:56.830 55 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting1.csv' ]
node1 7.879s 2025-11-20 18:09:56.833 56 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node0 7.896s 2025-11-20 18:09:56.850 36 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/0/ConsistencyTestLog.csv
node0 7.896s 2025-11-20 18:09:56.850 37 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node0 7.915s 2025-11-20 18:09:56.869 38 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: e55cb0feb26a5e52b793a5db159084e9a1ab2ad3b3045a9e762a1d28d641344e0d63488bafd027ea0dc5f2e339f0f936
(root) VirtualMap state / fee-theory-ready-ski {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":2,"lastLeafPath":4},"Singletons":{"RosterService.ROSTER_STATE":{"path":2,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":3,"mnemonic":"normal-stage-book-frozen"}}}
node0 7.919s 2025-11-20 18:09:56.873 40 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: Starting the ReconnectController
node0 8.179s 2025-11-20 18:09:57.133 41 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node0 8.186s 2025-11-20 18:09:57.140 42 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node0 8.193s 2025-11-20 18:09:57.147 43 INFO STARTUP <<start-node-0>> ConsistencyTestingToolMain: init called in Main for node 0.
node0 8.194s 2025-11-20 18:09:57.148 44 INFO STARTUP <<start-node-0>> SwirldsPlatform: Starting platform 0
node0 8.195s 2025-11-20 18:09:57.149 45 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node0 8.200s 2025-11-20 18:09:57.154 46 INFO STARTUP <<start-node-0>> CycleFinder: No cyclical back pressure detected in wiring model.
node0 8.202s 2025-11-20 18:09:57.156 47 INFO STARTUP <<start-node-0>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node0 8.203s 2025-11-20 18:09:57.157 48 INFO STARTUP <<start-node-0>> InputWireChecks: All input wires have been bound.
node0 8.205s 2025-11-20 18:09:57.159 49 WARN STARTUP <<start-node-0>> PcesFileTracker: No preconsensus event files available
node0 8.206s 2025-11-20 18:09:57.160 50 INFO STARTUP <<start-node-0>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node0 8.208s 2025-11-20 18:09:57.162 51 INFO STARTUP <<start-node-0>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node0 8.210s 2025-11-20 18:09:57.164 52 INFO STARTUP <<app: appMain 0>> ConsistencyTestingToolMain: run called in Main.
node0 8.210s 2025-11-20 18:09:57.164 53 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 224.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node0 8.216s 2025-11-20 18:09:57.170 54 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 5.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node2 9.729s 2025-11-20 18:09:58.683 55 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting2.csv' ]
node2 9.732s 2025-11-20 18:09:58.686 56 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node4 9.965s 2025-11-20 18:09:58.919 55 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting4.csv' ]
node4 9.968s 2025-11-20 18:09:58.922 56 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node3 10.605s 2025-11-20 18:09:59.559 55 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting3.csv' ]
node3 10.608s 2025-11-20 18:09:59.562 56 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node0 11.213s 2025-11-20 18:10:00.167 55 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting0.csv' ]
node0 11.218s 2025-11-20 18:10:00.172 56 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node1 14.974s 2025-11-20 18:10:03.928 57 INFO PLATFORM_STATUS <platformForkJoinThread-2> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node2 16.827s 2025-11-20 18:10:05.781 57 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node4 17.059s 2025-11-20 18:10:06.013 57 INFO PLATFORM_STATUS <platformForkJoinThread-2> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node3 17.702s 2025-11-20 18:10:06.656 57 INFO PLATFORM_STATUS <platformForkJoinThread-3> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node0 18.305s 2025-11-20 18:10:07.259 57 INFO PLATFORM_STATUS <platformForkJoinThread-6> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node1 18.697s 2025-11-20 18:10:07.651 58 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 3.7 s in CHECKING. Now in ACTIVE
node1 18.699s 2025-11-20 18:10:07.653 60 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node2 18.757s 2025-11-20 18:10:07.711 59 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node0 18.782s 2025-11-20 18:10:07.736 59 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node3 18.782s 2025-11-20 18:10:07.736 59 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node4 18.870s 2025-11-20 18:10:07.824 59 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node3 18.898s 2025-11-20 18:10:07.852 74 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1
node3 18.900s 2025-11-20 18:10:07.854 75 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for 1
node0 18.918s 2025-11-20 18:10:07.872 74 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1
node0 18.920s 2025-11-20 18:10:07.874 75 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for 1
node2 18.951s 2025-11-20 18:10:07.905 74 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1
node4 18.952s 2025-11-20 18:10:07.906 74 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1
node2 18.953s 2025-11-20 18:10:07.907 75 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for 1
node4 18.953s 2025-11-20 18:10:07.907 75 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for 1
node1 18.971s 2025-11-20 18:10:07.925 75 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1
node1 18.973s 2025-11-20 18:10:07.927 76 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for 1
node2 19.177s 2025-11-20 18:10:08.131 108 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 2.3 s in CHECKING. Now in ACTIVE
node3 19.186s 2025-11-20 18:10:08.140 107 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for 1
node3 19.189s 2025-11-20 18:10:08.143 108 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1
Timestamp: 2025-11-20T18:10:04.264780348Z
Next consensus number: 1
Legacy running event hash: 79d2c46d630f836deb89ed7e9dd64704c77f01270ab9a410db32c957ca2a5fa57480407972257bdeef3459a87f745a58
Legacy running event mnemonic: demise-include-giggle-mystery
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1450302654
Root hash: 2cdcb39de598c014d8fc5c47cb9c7402247d2f1d2af1f2c07677d4aea4d807da57f8931108e9d263640732cd3ff2e028
(root) VirtualMap state / oppose-midnight-escape-work {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"vanish-alpha-injury-grocery"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"paddle-robust-token-stove"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"vehicle-interest-purchase-barrel"}}}
node3 19.192s 2025-11-20 18:10:08.146 110 INFO PLATFORM_STATUS <platformForkJoinThread-8> StatusStateMachine: Platform spent 1.5 s in CHECKING. Now in ACTIVE
node2 19.193s 2025-11-20 18:10:08.147 111 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for 1
node0 19.196s 2025-11-20 18:10:08.150 108 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for 1
node2 19.196s 2025-11-20 18:10:08.150 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1
Timestamp: 2025-11-20T18:10:04.264780348Z
Next consensus number: 1
Legacy running event hash: 79d2c46d630f836deb89ed7e9dd64704c77f01270ab9a410db32c957ca2a5fa57480407972257bdeef3459a87f745a58
Legacy running event mnemonic: demise-include-giggle-mystery
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1450302654
Root hash: 2cdcb39de598c014d8fc5c47cb9c7402247d2f1d2af1f2c07677d4aea4d807da57f8931108e9d263640732cd3ff2e028
(root) VirtualMap state / oppose-midnight-escape-work {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"vanish-alpha-injury-grocery"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"paddle-robust-token-stove"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"vehicle-interest-purchase-barrel"}}}
node0 19.200s 2025-11-20 18:10:08.154 109 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1
Timestamp: 2025-11-20T18:10:04.264780348Z
Next consensus number: 1
Legacy running event hash: 79d2c46d630f836deb89ed7e9dd64704c77f01270ab9a410db32c957ca2a5fa57480407972257bdeef3459a87f745a58
Legacy running event mnemonic: demise-include-giggle-mystery
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1450302654
Root hash: 2cdcb39de598c014d8fc5c47cb9c7402247d2f1d2af1f2c07677d4aea4d807da57f8931108e9d263640732cd3ff2e028
(root) VirtualMap state / oppose-midnight-escape-work {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"vanish-alpha-injury-grocery"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"paddle-robust-token-stove"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"vehicle-interest-purchase-barrel"}}}
node1 19.208s 2025-11-20 18:10:08.162 108 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for 1
node1 19.211s 2025-11-20 18:10:08.165 109 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1
Timestamp: 2025-11-20T18:10:04.264780348Z
Next consensus number: 1
Legacy running event hash: 79d2c46d630f836deb89ed7e9dd64704c77f01270ab9a410db32c957ca2a5fa57480407972257bdeef3459a87f745a58
Legacy running event mnemonic: demise-include-giggle-mystery
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1450302654
Root hash: 2cdcb39de598c014d8fc5c47cb9c7402247d2f1d2af1f2c07677d4aea4d807da57f8931108e9d263640732cd3ff2e028
(root) VirtualMap state / oppose-midnight-escape-work {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"vanish-alpha-injury-grocery"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"paddle-robust-token-stove"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"vehicle-interest-purchase-barrel"}}}
node4 19.217s 2025-11-20 18:10:08.171 106 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for 1
node4 19.220s 2025-11-20 18:10:08.174 107 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1
Timestamp: 2025-11-20T18:10:04.264780348Z
Next consensus number: 1
Legacy running event hash: 79d2c46d630f836deb89ed7e9dd64704c77f01270ab9a410db32c957ca2a5fa57480407972257bdeef3459a87f745a58
Legacy running event mnemonic: demise-include-giggle-mystery
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1450302654
Root hash: 2cdcb39de598c014d8fc5c47cb9c7402247d2f1d2af1f2c07677d4aea4d807da57f8931108e9d263640732cd3ff2e028
(root) VirtualMap state / oppose-midnight-escape-work {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"vanish-alpha-injury-grocery"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"paddle-robust-token-stove"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"vehicle-interest-purchase-barrel"}}}
node3 19.227s 2025-11-20 18:10:08.181 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/11/20/2025-11-20T18+10+04.243478241Z_seq0_minr1_maxr501_orgn0.pces
node3 19.228s 2025-11-20 18:10:08.182 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/3/2025/11/20/2025-11-20T18+10+04.243478241Z_seq0_minr1_maxr501_orgn0.pces
node3 19.228s 2025-11-20 18:10:08.182 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 19.230s 2025-11-20 18:10:08.184 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 19.236s 2025-11-20 18:10:08.190 116 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 19.239s 2025-11-20 18:10:08.193 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/11/20/2025-11-20T18+10+04.273287088Z_seq0_minr1_maxr501_orgn0.pces
node2 19.240s 2025-11-20 18:10:08.194 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/2/2025/11/20/2025-11-20T18+10+04.273287088Z_seq0_minr1_maxr501_orgn0.pces
node2 19.240s 2025-11-20 18:10:08.194 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 19.242s 2025-11-20 18:10:08.196 116 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 19.248s 2025-11-20 18:10:08.202 117 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 19.253s 2025-11-20 18:10:08.207 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/11/20/2025-11-20T18+10+04.284523695Z_seq0_minr1_maxr501_orgn0.pces
node1 19.254s 2025-11-20 18:10:08.208 110 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/11/20/2025-11-20T18+10+03.963750774Z_seq0_minr1_maxr501_orgn0.pces
node1 19.254s 2025-11-20 18:10:08.208 111 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/1/2025/11/20/2025-11-20T18+10+03.963750774Z_seq0_minr1_maxr501_orgn0.pces
node0 19.255s 2025-11-20 18:10:08.209 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/0/2025/11/20/2025-11-20T18+10+04.284523695Z_seq0_minr1_maxr501_orgn0.pces
node0 19.255s 2025-11-20 18:10:08.209 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 19.255s 2025-11-20 18:10:08.209 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 19.256s 2025-11-20 18:10:08.210 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 19.257s 2025-11-20 18:10:08.211 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 19.262s 2025-11-20 18:10:08.216 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 19.265s 2025-11-20 18:10:08.219 116 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 19.266s 2025-11-20 18:10:08.220 108 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/11/20/2025-11-20T18+10+04.145266281Z_seq0_minr1_maxr501_orgn0.pces
node4 19.267s 2025-11-20 18:10:08.221 109 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/4/2025/11/20/2025-11-20T18+10+04.145266281Z_seq0_minr1_maxr501_orgn0.pces
node4 19.267s 2025-11-20 18:10:08.221 110 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 19.269s 2025-11-20 18:10:08.223 111 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 19.275s 2025-11-20 18:10:08.229 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 19.294s 2025-11-20 18:10:08.248 114 INFO PLATFORM_STATUS <platformForkJoinThread-6> StatusStateMachine: Platform spent 2.2 s in CHECKING. Now in ACTIVE
node0 19.642s 2025-11-20 18:10:08.596 126 INFO PLATFORM_STATUS <platformForkJoinThread-6> StatusStateMachine: Platform spent 1.3 s in CHECKING. Now in ACTIVE
node2 1m 12.753s 2025-11-20 18:11:01.707 1372 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 120 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 1m 12.780s 2025-11-20 18:11:01.734 1388 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 120 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 1m 12.826s 2025-11-20 18:11:01.780 1377 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 120 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 1m 12.859s 2025-11-20 18:11:01.813 1397 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 120 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 1m 12.867s 2025-11-20 18:11:01.821 1386 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 120 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 1m 12.955s 2025-11-20 18:11:01.909 1383 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 120 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/120
node3 1m 12.956s 2025-11-20 18:11:01.910 1384 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for 120
node1 1m 13.005s 2025-11-20 18:11:01.959 1403 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 120 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/120
node1 1m 13.006s 2025-11-20 18:11:01.960 1404 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for 120
node3 1m 13.052s 2025-11-20 18:11:02.006 1427 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for 120
node3 1m 13.054s 2025-11-20 18:11:02.008 1428 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 120
Timestamp: 2025-11-20T18:11:00.442893Z
Next consensus number: 4297
Legacy running event hash: 645651750a3eac57f8534eb19cf100c270f3378b675a8e733821ec79dbdf8ca2215ccf63a4e42bdc29eedc0bf769a467
Legacy running event mnemonic: member-concert-elephant-inside
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -2112197699
Root hash: 5394730fd3e0286481b0bdce02c3818264f3eb8cc514cd7747da176c531e3ad666916858bdac6683e509d71f41aec78b
(root) VirtualMap state / income-giggle-lottery-arrive {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"aim-repeat-pizza-layer"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"skull-rapid-fan-six"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"kiss-stable-absent-radio"}}}
node3 1m 13.063s 2025-11-20 18:11:02.017 1429 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/11/20/2025-11-20T18+10+04.243478241Z_seq0_minr1_maxr501_orgn0.pces
node3 1m 13.064s 2025-11-20 18:11:02.018 1430 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 93 File: data/saved/preconsensus-events/3/2025/11/20/2025-11-20T18+10+04.243478241Z_seq0_minr1_maxr501_orgn0.pces
node3 1m 13.064s 2025-11-20 18:11:02.018 1431 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 1m 13.068s 2025-11-20 18:11:02.022 1432 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 1m 13.068s 2025-11-20 18:11:02.022 1433 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 120 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/120 {"round":120,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/120/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 1m 13.076s 2025-11-20 18:11:02.030 1378 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 120 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/120
node2 1m 13.077s 2025-11-20 18:11:02.031 1379 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for 120
node1 1m 13.089s 2025-11-20 18:11:02.043 1447 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for 120
node1 1m 13.091s 2025-11-20 18:11:02.045 1448 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 120
Timestamp: 2025-11-20T18:11:00.442893Z
Next consensus number: 4297
Legacy running event hash: 645651750a3eac57f8534eb19cf100c270f3378b675a8e733821ec79dbdf8ca2215ccf63a4e42bdc29eedc0bf769a467
Legacy running event mnemonic: member-concert-elephant-inside
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -2112197699
Root hash: 5394730fd3e0286481b0bdce02c3818264f3eb8cc514cd7747da176c531e3ad666916858bdac6683e509d71f41aec78b
(root) VirtualMap state / income-giggle-lottery-arrive {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"aim-repeat-pizza-layer"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"skull-rapid-fan-six"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"kiss-stable-absent-radio"}}}
node4 1m 13.095s 2025-11-20 18:11:02.049 1392 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 120 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/120
node4 1m 13.096s 2025-11-20 18:11:02.050 1393 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for 120
node1 1m 13.102s 2025-11-20 18:11:02.056 1449 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/11/20/2025-11-20T18+10+03.963750774Z_seq0_minr1_maxr501_orgn0.pces
node1 1m 13.102s 2025-11-20 18:11:02.056 1450 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 93 File: data/saved/preconsensus-events/1/2025/11/20/2025-11-20T18+10+03.963750774Z_seq0_minr1_maxr501_orgn0.pces
node1 1m 13.103s 2025-11-20 18:11:02.057 1451 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 1m 13.107s 2025-11-20 18:11:02.061 1452 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 1m 13.107s 2025-11-20 18:11:02.061 1453 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 120 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/120 {"round":120,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/120/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 1m 13.144s 2025-11-20 18:11:02.098 1394 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 120 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/120
node0 1m 13.145s 2025-11-20 18:11:02.099 1395 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for 120
node4 1m 13.182s 2025-11-20 18:11:02.136 1436 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for 120
node4 1m 13.185s 2025-11-20 18:11:02.139 1437 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 120
Timestamp: 2025-11-20T18:11:00.442893Z
Next consensus number: 4297
Legacy running event hash: 645651750a3eac57f8534eb19cf100c270f3378b675a8e733821ec79dbdf8ca2215ccf63a4e42bdc29eedc0bf769a467
Legacy running event mnemonic: member-concert-elephant-inside
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -2112197699
Root hash: 5394730fd3e0286481b0bdce02c3818264f3eb8cc514cd7747da176c531e3ad666916858bdac6683e509d71f41aec78b
(root) VirtualMap state / income-giggle-lottery-arrive {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"aim-repeat-pizza-layer"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"skull-rapid-fan-six"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"kiss-stable-absent-radio"}}}
node2 1m 13.195s 2025-11-20 18:11:02.149 1425 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for 120
node4 1m 13.196s 2025-11-20 18:11:02.150 1438 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/11/20/2025-11-20T18+10+04.145266281Z_seq0_minr1_maxr501_orgn0.pces
node4 1m 13.196s 2025-11-20 18:11:02.150 1439 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 93 File: data/saved/preconsensus-events/4/2025/11/20/2025-11-20T18+10+04.145266281Z_seq0_minr1_maxr501_orgn0.pces
node4 1m 13.197s 2025-11-20 18:11:02.151 1440 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 1m 13.198s 2025-11-20 18:11:02.152 1426 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 120
Timestamp: 2025-11-20T18:11:00.442893Z
Next consensus number: 4297
Legacy running event hash: 645651750a3eac57f8534eb19cf100c270f3378b675a8e733821ec79dbdf8ca2215ccf63a4e42bdc29eedc0bf769a467
Legacy running event mnemonic: member-concert-elephant-inside
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -2112197699
Root hash: 5394730fd3e0286481b0bdce02c3818264f3eb8cc514cd7747da176c531e3ad666916858bdac6683e509d71f41aec78b
(root) VirtualMap state / income-giggle-lottery-arrive {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"aim-repeat-pizza-layer"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"skull-rapid-fan-six"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"kiss-stable-absent-radio"}}}
node4 1m 13.201s 2025-11-20 18:11:02.155 1441 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 1m 13.202s 2025-11-20 18:11:02.156 1442 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 120 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/120 {"round":120,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/120/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 1m 13.207s 2025-11-20 18:11:02.161 1427 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/11/20/2025-11-20T18+10+04.273287088Z_seq0_minr1_maxr501_orgn0.pces
node2 1m 13.208s 2025-11-20 18:11:02.162 1428 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 93 File: data/saved/preconsensus-events/2/2025/11/20/2025-11-20T18+10+04.273287088Z_seq0_minr1_maxr501_orgn0.pces
node2 1m 13.209s 2025-11-20 18:11:02.163 1429 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 1m 13.212s 2025-11-20 18:11:02.166 1430 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 1m 13.213s 2025-11-20 18:11:02.167 1431 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 120 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/120 {"round":120,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/120/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 1m 13.239s 2025-11-20 18:11:02.193 1441 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for 120
node0 1m 13.241s 2025-11-20 18:11:02.195 1442 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 120
Timestamp: 2025-11-20T18:11:00.442893Z
Next consensus number: 4297
Legacy running event hash: 645651750a3eac57f8534eb19cf100c270f3378b675a8e733821ec79dbdf8ca2215ccf63a4e42bdc29eedc0bf769a467
Legacy running event mnemonic: member-concert-elephant-inside
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -2112197699
Root hash: 5394730fd3e0286481b0bdce02c3818264f3eb8cc514cd7747da176c531e3ad666916858bdac6683e509d71f41aec78b
(root) VirtualMap state / income-giggle-lottery-arrive {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"aim-repeat-pizza-layer"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"skull-rapid-fan-six"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"kiss-stable-absent-radio"}}}
node0 1m 13.252s 2025-11-20 18:11:02.206 1443 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/11/20/2025-11-20T18+10+04.284523695Z_seq0_minr1_maxr501_orgn0.pces
node0 1m 13.252s 2025-11-20 18:11:02.206 1444 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 93 File: data/saved/preconsensus-events/0/2025/11/20/2025-11-20T18+10+04.284523695Z_seq0_minr1_maxr501_orgn0.pces
node0 1m 13.254s 2025-11-20 18:11:02.208 1445 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 1m 13.257s 2025-11-20 18:11:02.211 1446 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 1m 13.258s 2025-11-20 18:11:02.212 1447 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 120 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/120 {"round":120,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/120/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 2m 12.107s 2025-11-20 18:12:01.061 2864 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 249 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 2m 12.248s 2025-11-20 18:12:01.202 2852 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 249 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 2m 12.270s 2025-11-20 18:12:01.224 2867 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 249 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 2m 12.315s 2025-11-20 18:12:01.269 2847 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 249 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 2m 12.351s 2025-11-20 18:12:01.305 2851 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 249 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 2m 12.418s 2025-11-20 18:12:01.372 2850 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 249 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/249
node2 2m 12.419s 2025-11-20 18:12:01.373 2851 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for 249
node0 2m 12.475s 2025-11-20 18:12:01.429 2870 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 249 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/249
node0 2m 12.476s 2025-11-20 18:12:01.430 2871 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for 249
node1 2m 12.486s 2025-11-20 18:12:01.440 2867 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 249 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/249
node1 2m 12.487s 2025-11-20 18:12:01.441 2868 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for 249
node3 2m 12.488s 2025-11-20 18:12:01.442 2855 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 249 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/249
node3 2m 12.489s 2025-11-20 18:12:01.443 2856 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for 249
node2 2m 12.511s 2025-11-20 18:12:01.465 2882 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for 249
node2 2m 12.513s 2025-11-20 18:12:01.467 2883 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 249
Timestamp: 2025-11-20T18:12:00.037398121Z
Next consensus number: 9035
Legacy running event hash: a07ea89ae2a195753304ad6571df166c54d34d3fbca02655d38f8318386e89205233f09d9b87760318e739893588d611
Legacy running event mnemonic: utility-under-program-fabric
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1749498136
Root hash: e9f87593d40017b9c77ce0d28638ace3989e6b5af923b1151b01955efa3024e347d93a7ecfb90883ab282ac0de58b2d5
(root) VirtualMap state / scissors-raven-length-warm {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"heart-extra-heart-clean"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"top-income-rubber-ordinary"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"caution-story-creek-deer"}}}
node2 2m 12.521s 2025-11-20 18:12:01.475 2884 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/11/20/2025-11-20T18+10+04.273287088Z_seq0_minr1_maxr501_orgn0.pces
node2 2m 12.521s 2025-11-20 18:12:01.475 2885 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 221 File: data/saved/preconsensus-events/2/2025/11/20/2025-11-20T18+10+04.273287088Z_seq0_minr1_maxr501_orgn0.pces
node2 2m 12.521s 2025-11-20 18:12:01.475 2886 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 2m 12.528s 2025-11-20 18:12:01.482 2895 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 2m 12.528s 2025-11-20 18:12:01.482 2896 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 249 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/249 {"round":249,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/249/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 2m 12.549s 2025-11-20 18:12:01.503 2854 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 249 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/249
node4 2m 12.550s 2025-11-20 18:12:01.504 2855 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for 249
node0 2m 12.571s 2025-11-20 18:12:01.525 2902 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for 249
node0 2m 12.573s 2025-11-20 18:12:01.527 2903 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 249
Timestamp: 2025-11-20T18:12:00.037398121Z
Next consensus number: 9035
Legacy running event hash: a07ea89ae2a195753304ad6571df166c54d34d3fbca02655d38f8318386e89205233f09d9b87760318e739893588d611
Legacy running event mnemonic: utility-under-program-fabric
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1749498136
Root hash: e9f87593d40017b9c77ce0d28638ace3989e6b5af923b1151b01955efa3024e347d93a7ecfb90883ab282ac0de58b2d5
(root) VirtualMap state / scissors-raven-length-warm {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"heart-extra-heart-clean"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"top-income-rubber-ordinary"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"caution-story-creek-deer"}}}
node3 2m 12.578s 2025-11-20 18:12:01.532 2887 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for 249
node1 2m 12.580s 2025-11-20 18:12:01.534 2899 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for 249
node0 2m 12.581s 2025-11-20 18:12:01.535 2904 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/11/20/2025-11-20T18+10+04.284523695Z_seq0_minr1_maxr501_orgn0.pces
node3 2m 12.581s 2025-11-20 18:12:01.535 2888 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 249
Timestamp: 2025-11-20T18:12:00.037398121Z
Next consensus number: 9035
Legacy running event hash: a07ea89ae2a195753304ad6571df166c54d34d3fbca02655d38f8318386e89205233f09d9b87760318e739893588d611
Legacy running event mnemonic: utility-under-program-fabric
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1749498136
Root hash: e9f87593d40017b9c77ce0d28638ace3989e6b5af923b1151b01955efa3024e347d93a7ecfb90883ab282ac0de58b2d5
(root) VirtualMap state / scissors-raven-length-warm {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"heart-extra-heart-clean"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"top-income-rubber-ordinary"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"caution-story-creek-deer"}}}
node0 2m 12.582s 2025-11-20 18:12:01.536 2905 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 221 File: data/saved/preconsensus-events/0/2025/11/20/2025-11-20T18+10+04.284523695Z_seq0_minr1_maxr501_orgn0.pces
node0 2m 12.582s 2025-11-20 18:12:01.536 2906 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 2m 12.583s 2025-11-20 18:12:01.537 2900 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 249
Timestamp: 2025-11-20T18:12:00.037398121Z
Next consensus number: 9035
Legacy running event hash: a07ea89ae2a195753304ad6571df166c54d34d3fbca02655d38f8318386e89205233f09d9b87760318e739893588d611
Legacy running event mnemonic: utility-under-program-fabric
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1749498136
Root hash: e9f87593d40017b9c77ce0d28638ace3989e6b5af923b1151b01955efa3024e347d93a7ecfb90883ab282ac0de58b2d5
(root) VirtualMap state / scissors-raven-length-warm {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"heart-extra-heart-clean"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"top-income-rubber-ordinary"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"caution-story-creek-deer"}}}
node3 2m 12.588s 2025-11-20 18:12:01.542 2889 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/11/20/2025-11-20T18+10+04.243478241Z_seq0_minr1_maxr501_orgn0.pces
node3 2m 12.588s 2025-11-20 18:12:01.542 2890 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 221 File: data/saved/preconsensus-events/3/2025/11/20/2025-11-20T18+10+04.243478241Z_seq0_minr1_maxr501_orgn0.pces
node3 2m 12.588s 2025-11-20 18:12:01.542 2891 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 2m 12.589s 2025-11-20 18:12:01.543 2907 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 2m 12.589s 2025-11-20 18:12:01.543 2908 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 249 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/249 {"round":249,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/249/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 2m 12.592s 2025-11-20 18:12:01.546 2901 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/11/20/2025-11-20T18+10+03.963750774Z_seq0_minr1_maxr501_orgn0.pces
node1 2m 12.593s 2025-11-20 18:12:01.547 2902 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 221 File: data/saved/preconsensus-events/1/2025/11/20/2025-11-20T18+10+03.963750774Z_seq0_minr1_maxr501_orgn0.pces
node1 2m 12.593s 2025-11-20 18:12:01.547 2903 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 2m 12.594s 2025-11-20 18:12:01.548 2892 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 2m 12.595s 2025-11-20 18:12:01.549 2893 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 249 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/249 {"round":249,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/249/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 2m 12.599s 2025-11-20 18:12:01.553 2904 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 2m 12.600s 2025-11-20 18:12:01.554 2905 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 249 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/249 {"round":249,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/249/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 2m 12.641s 2025-11-20 18:12:01.595 2894 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for 249
node4 2m 12.644s 2025-11-20 18:12:01.598 2895 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 249
Timestamp: 2025-11-20T18:12:00.037398121Z
Next consensus number: 9035
Legacy running event hash: a07ea89ae2a195753304ad6571df166c54d34d3fbca02655d38f8318386e89205233f09d9b87760318e739893588d611
Legacy running event mnemonic: utility-under-program-fabric
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1749498136
Root hash: e9f87593d40017b9c77ce0d28638ace3989e6b5af923b1151b01955efa3024e347d93a7ecfb90883ab282ac0de58b2d5
(root) VirtualMap state / scissors-raven-length-warm {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"heart-extra-heart-clean"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"top-income-rubber-ordinary"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"caution-story-creek-deer"}}}
node4 2m 12.654s 2025-11-20 18:12:01.608 2896 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/11/20/2025-11-20T18+10+04.145266281Z_seq0_minr1_maxr501_orgn0.pces
node4 2m 12.654s 2025-11-20 18:12:01.608 2897 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 221 File: data/saved/preconsensus-events/4/2025/11/20/2025-11-20T18+10+04.145266281Z_seq0_minr1_maxr501_orgn0.pces
node4 2m 12.654s 2025-11-20 18:12:01.608 2898 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 2m 12.662s 2025-11-20 18:12:01.616 2899 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 2m 12.662s 2025-11-20 18:12:01.616 2900 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 249 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/249 {"round":249,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/249/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 3m 11.892s 2025-11-20 18:13:00.846 4347 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 381 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 3m 11.907s 2025-11-20 18:13:00.861 4374 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 381 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 3m 11.934s 2025-11-20 18:13:00.888 4349 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 381 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 3m 11.986s 2025-11-20 18:13:00.940 4364 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 381 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 3m 12.020s 2025-11-20 18:13:00.974 4415 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 381 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 3m 12.160s 2025-11-20 18:13:01.114 4367 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 381 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/381
node3 3m 12.160s 2025-11-20 18:13:01.114 4368 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for 381
node1 3m 12.194s 2025-11-20 18:13:01.148 4377 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 381 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/381
node1 3m 12.195s 2025-11-20 18:13:01.149 4378 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for 381
node3 3m 12.247s 2025-11-20 18:13:01.201 4407 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for 381
node3 3m 12.249s 2025-11-20 18:13:01.203 4408 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 381
Timestamp: 2025-11-20T18:13:00.018740Z
Next consensus number: 13876
Legacy running event hash: 603d375dda65e4d930e983c75f8eec179264436ea50a3609cb200d48a69d5cd2f3eeefdebac26fdb8c83c934589c27e6
Legacy running event mnemonic: amateur-two-recall-gossip
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -2016645397
Root hash: 6730f23a9abb835a743fbcb4ac5c6a9efd45891ae593010285842ebf0380af87aecb64527710558a3898ab300d5fb8d2
(root) VirtualMap state / toward-rhythm-post-network {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"path-relax-exercise-cream"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"hint-kiss-guard-man"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"notable-estate-upon-present"}}}
node3 3m 12.256s 2025-11-20 18:13:01.210 4409 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/11/20/2025-11-20T18+10+04.243478241Z_seq0_minr1_maxr501_orgn0.pces
node3 3m 12.257s 2025-11-20 18:13:01.211 4410 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 354 File: data/saved/preconsensus-events/3/2025/11/20/2025-11-20T18+10+04.243478241Z_seq0_minr1_maxr501_orgn0.pces
node3 3m 12.257s 2025-11-20 18:13:01.211 4411 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 3m 12.265s 2025-11-20 18:13:01.219 4418 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 381 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/381
node0 3m 12.266s 2025-11-20 18:13:01.220 4419 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for 381
node3 3m 12.267s 2025-11-20 18:13:01.221 4412 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 3m 12.268s 2025-11-20 18:13:01.222 4413 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 381 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/381 {"round":381,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/381/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 3m 12.279s 2025-11-20 18:13:01.233 4417 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for 381
node1 3m 12.281s 2025-11-20 18:13:01.235 4418 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 381
Timestamp: 2025-11-20T18:13:00.018740Z
Next consensus number: 13876
Legacy running event hash: 603d375dda65e4d930e983c75f8eec179264436ea50a3609cb200d48a69d5cd2f3eeefdebac26fdb8c83c934589c27e6
Legacy running event mnemonic: amateur-two-recall-gossip
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -2016645397
Root hash: 6730f23a9abb835a743fbcb4ac5c6a9efd45891ae593010285842ebf0380af87aecb64527710558a3898ab300d5fb8d2
(root) VirtualMap state / toward-rhythm-post-network {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"path-relax-exercise-cream"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"hint-kiss-guard-man"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"notable-estate-upon-present"}}}
node2 3m 12.288s 2025-11-20 18:13:01.242 4350 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 381 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/381
node2 3m 12.289s 2025-11-20 18:13:01.243 4351 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for 381
node1 3m 12.290s 2025-11-20 18:13:01.244 4419 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/11/20/2025-11-20T18+10+03.963750774Z_seq0_minr1_maxr501_orgn0.pces
node1 3m 12.290s 2025-11-20 18:13:01.244 4420 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 354 File: data/saved/preconsensus-events/1/2025/11/20/2025-11-20T18+10+03.963750774Z_seq0_minr1_maxr501_orgn0.pces
node1 3m 12.290s 2025-11-20 18:13:01.244 4421 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 3m 12.294s 2025-11-20 18:13:01.248 4362 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 381 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/381
node4 3m 12.295s 2025-11-20 18:13:01.249 4363 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for 381
node1 3m 12.300s 2025-11-20 18:13:01.254 4422 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 3m 12.301s 2025-11-20 18:13:01.255 4423 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 381 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/381 {"round":381,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/381/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 3m 12.360s 2025-11-20 18:13:01.314 4450 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for 381
node0 3m 12.362s 2025-11-20 18:13:01.316 4451 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 381
Timestamp: 2025-11-20T18:13:00.018740Z
Next consensus number: 13876
Legacy running event hash: 603d375dda65e4d930e983c75f8eec179264436ea50a3609cb200d48a69d5cd2f3eeefdebac26fdb8c83c934589c27e6
Legacy running event mnemonic: amateur-two-recall-gossip
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -2016645397
Root hash: 6730f23a9abb835a743fbcb4ac5c6a9efd45891ae593010285842ebf0380af87aecb64527710558a3898ab300d5fb8d2
(root) VirtualMap state / toward-rhythm-post-network {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"path-relax-exercise-cream"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"hint-kiss-guard-man"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"notable-estate-upon-present"}}}
node0 3m 12.371s 2025-11-20 18:13:01.325 4452 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/11/20/2025-11-20T18+10+04.284523695Z_seq0_minr1_maxr501_orgn0.pces
node0 3m 12.371s 2025-11-20 18:13:01.325 4453 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 354 File: data/saved/preconsensus-events/0/2025/11/20/2025-11-20T18+10+04.284523695Z_seq0_minr1_maxr501_orgn0.pces
node0 3m 12.372s 2025-11-20 18:13:01.326 4454 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 3m 12.373s 2025-11-20 18:13:01.327 4390 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for 381
node2 3m 12.374s 2025-11-20 18:13:01.328 4391 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 381
Timestamp: 2025-11-20T18:13:00.018740Z
Next consensus number: 13876
Legacy running event hash: 603d375dda65e4d930e983c75f8eec179264436ea50a3609cb200d48a69d5cd2f3eeefdebac26fdb8c83c934589c27e6
Legacy running event mnemonic: amateur-two-recall-gossip
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -2016645397
Root hash: 6730f23a9abb835a743fbcb4ac5c6a9efd45891ae593010285842ebf0380af87aecb64527710558a3898ab300d5fb8d2
(root) VirtualMap state / toward-rhythm-post-network {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"path-relax-exercise-cream"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"hint-kiss-guard-man"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"notable-estate-upon-present"}}}
node0 3m 12.382s 2025-11-20 18:13:01.336 4455 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 3m 12.382s 2025-11-20 18:13:01.336 4392 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/11/20/2025-11-20T18+10+04.273287088Z_seq0_minr1_maxr501_orgn0.pces
node0 3m 12.383s 2025-11-20 18:13:01.337 4456 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 381 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/381 {"round":381,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/381/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 3m 12.383s 2025-11-20 18:13:01.337 4393 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 354 File: data/saved/preconsensus-events/2/2025/11/20/2025-11-20T18+10+04.273287088Z_seq0_minr1_maxr501_orgn0.pces
node2 3m 12.383s 2025-11-20 18:13:01.337 4394 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 3m 12.392s 2025-11-20 18:13:01.346 4395 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 3m 12.393s 2025-11-20 18:13:01.347 4396 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 381 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/381 {"round":381,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/381/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 3m 12.401s 2025-11-20 18:13:01.355 4402 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for 381
node4 3m 12.403s 2025-11-20 18:13:01.357 4403 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 381
Timestamp: 2025-11-20T18:13:00.018740Z
Next consensus number: 13876
Legacy running event hash: 603d375dda65e4d930e983c75f8eec179264436ea50a3609cb200d48a69d5cd2f3eeefdebac26fdb8c83c934589c27e6
Legacy running event mnemonic: amateur-two-recall-gossip
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -2016645397
Root hash: 6730f23a9abb835a743fbcb4ac5c6a9efd45891ae593010285842ebf0380af87aecb64527710558a3898ab300d5fb8d2
(root) VirtualMap state / toward-rhythm-post-network {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"path-relax-exercise-cream"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"hint-kiss-guard-man"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"notable-estate-upon-present"}}}
node4 3m 12.415s 2025-11-20 18:13:01.369 4404 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/11/20/2025-11-20T18+10+04.145266281Z_seq0_minr1_maxr501_orgn0.pces
node4 3m 12.415s 2025-11-20 18:13:01.369 4405 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 354 File: data/saved/preconsensus-events/4/2025/11/20/2025-11-20T18+10+04.145266281Z_seq0_minr1_maxr501_orgn0.pces
node4 3m 12.415s 2025-11-20 18:13:01.369 4406 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 3m 12.425s 2025-11-20 18:13:01.379 4407 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 3m 12.426s 2025-11-20 18:13:01.380 4408 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 381 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/381 {"round":381,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/381/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 3m 15.180s 2025-11-20 18:13:04.134 4467 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 2 to 4>> NetworkUtils: Connection broken: 2 -> 4
com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-20T18:13:04.133379917Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:65)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 8 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:387)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
    ... 8 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:431)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    ... 2 more
node0 3m 15.181s 2025-11-20 18:13:04.135 4536 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 0 to 4>> NetworkUtils: Connection broken: 0 -> 4
com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-20T18:13:04.133872081Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:65)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 8 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:387)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
    ... 8 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:431)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    ... 2 more
node3 3m 15.183s 2025-11-20 18:13:04.137 4476 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 3 to 4>> NetworkUtils: Connection broken: 3 -> 4
com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-20T18:13:04.134088888Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:65)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 8 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:387)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
    ... 8 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:431)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    ... 2 more
node1 3m 15.185s 2025-11-20 18:13:04.139 4490 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 1 to 4>> NetworkUtils: Connection broken: 1 -> 4
com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-20T18:13:04.134365726Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:65)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 8 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:387)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
    ... 8 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:431)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    ... 2 more
node1 4m 11.927s 2025-11-20 18:14:00.881 6023 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 519 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 4m 11.935s 2025-11-20 18:14:00.889 5937 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 519 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 4m 11.969s 2025-11-20 18:14:00.923 6132 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 519 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 4m 11.982s 2025-11-20 18:14:00.936 5932 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 519 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 4m 12.118s 2025-11-20 18:14:01.072 5935 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 519 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/519
node2 4m 12.119s 2025-11-20 18:14:01.073 5936 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for 519
node1 4m 12.146s 2025-11-20 18:14:01.100 6026 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 519 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/519
node1 4m 12.147s 2025-11-20 18:14:01.101 6027 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for 519
node3 4m 12.188s 2025-11-20 18:14:01.142 5950 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 519 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/519
node3 4m 12.189s 2025-11-20 18:14:01.143 5951 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for 519
node2 4m 12.205s 2025-11-20 18:14:01.159 5975 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for 519
node2 4m 12.207s 2025-11-20 18:14:01.161 5976 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 519
Timestamp: 2025-11-20T18:14:00.016553Z
Next consensus number: 17228
Legacy running event hash: ccd797942cef7196af5c76c9008950b583bce7dba6dd4257ee0ee0165973964df4565b1580c5164f48036464cdf76301
Legacy running event mnemonic: gown-assault-over-crazy
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1515013193
Root hash: 1eea72d433e62eecc0eac6f65ff4bba2ebd78b75e0b7a9f1b19ac92b01ffab32b28f76f060b22cdf175f540b6f160215
(root) VirtualMap state / tail-fold-lamp-talk {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"divide-issue-fruit-end"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"remove-nerve-aerobic-attitude"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"calm-catalog-crush-cheese"}}}
node2 4m 12.216s 2025-11-20 18:14:01.170 5977 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/11/20/2025-11-20T18+13+53.164255242Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/2/2025/11/20/2025-11-20T18+10+04.273287088Z_seq0_minr1_maxr501_orgn0.pces
node2 4m 12.216s 2025-11-20 18:14:01.170 5978 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus event files meeting specified criteria to copy.
Lower bound: 492
First file to copy: data/saved/preconsensus-events/2/2025/11/20/2025-11-20T18+10+04.273287088Z_seq0_minr1_maxr501_orgn0.pces
Last file to copy: data/saved/preconsensus-events/2/2025/11/20/2025-11-20T18+13+53.164255242Z_seq1_minr474_maxr5474_orgn0.pces
node2 4m 12.216s 2025-11-20 18:14:01.170 5979 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 2 preconsensus event file(s)
node2 4m 12.228s 2025-11-20 18:14:01.182 5980 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 2 preconsensus event file(s)
node2 4m 12.229s 2025-11-20 18:14:01.183 5981 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 519 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/519 {"round":519,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/519/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 4m 12.231s 2025-11-20 18:14:01.185 6058 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for 519
node1 4m 12.233s 2025-11-20 18:14:01.187 6059 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 519
Timestamp: 2025-11-20T18:14:00.016553Z
Next consensus number: 17228
Legacy running event hash: ccd797942cef7196af5c76c9008950b583bce7dba6dd4257ee0ee0165973964df4565b1580c5164f48036464cdf76301
Legacy running event mnemonic: gown-assault-over-crazy
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1515013193
Root hash: 1eea72d433e62eecc0eac6f65ff4bba2ebd78b75e0b7a9f1b19ac92b01ffab32b28f76f060b22cdf175f540b6f160215
(root) VirtualMap state / tail-fold-lamp-talk {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"divide-issue-fruit-end"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"remove-nerve-aerobic-attitude"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"calm-catalog-crush-cheese"}}}
node1 4m 12.239s 2025-11-20 18:14:01.193 6060 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/11/20/2025-11-20T18+10+03.963750774Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/1/2025/11/20/2025-11-20T18+13+53.150382113Z_seq1_minr474_maxr5474_orgn0.pces
node1 4m 12.240s 2025-11-20 18:14:01.194 6061 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus event files meeting specified criteria to copy.
Lower bound: 492
First file to copy: data/saved/preconsensus-events/1/2025/11/20/2025-11-20T18+10+03.963750774Z_seq0_minr1_maxr501_orgn0.pces
Last file to copy: data/saved/preconsensus-events/1/2025/11/20/2025-11-20T18+13+53.150382113Z_seq1_minr474_maxr5474_orgn0.pces
node1 4m 12.240s 2025-11-20 18:14:01.194 6062 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 2 preconsensus event file(s)
node1 4m 12.252s 2025-11-20 18:14:01.206 6063 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 2 preconsensus event file(s)
node1 4m 12.252s 2025-11-20 18:14:01.206 6064 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 519 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/519 {"round":519,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/519/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 4m 12.276s 2025-11-20 18:14:01.230 5990 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for 519
node3 4m 12.278s 2025-11-20 18:14:01.232 5991 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 519
Timestamp: 2025-11-20T18:14:00.016553Z
Next consensus number: 17228
Legacy running event hash: ccd797942cef7196af5c76c9008950b583bce7dba6dd4257ee0ee0165973964df4565b1580c5164f48036464cdf76301
Legacy running event mnemonic: gown-assault-over-crazy
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1515013193
Root hash: 1eea72d433e62eecc0eac6f65ff4bba2ebd78b75e0b7a9f1b19ac92b01ffab32b28f76f060b22cdf175f540b6f160215
(root) VirtualMap state / tail-fold-lamp-talk {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"divide-issue-fruit-end"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"remove-nerve-aerobic-attitude"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"calm-catalog-crush-cheese"}}}
node3 4m 12.287s 2025-11-20 18:14:01.241 5992 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/11/20/2025-11-20T18+13+53.099078562Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/3/2025/11/20/2025-11-20T18+10+04.243478241Z_seq0_minr1_maxr501_orgn0.pces
node3 4m 12.288s 2025-11-20 18:14:01.242 5993 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus event files meeting specified criteria to copy.
Lower bound: 492
First file to copy: data/saved/preconsensus-events/3/2025/11/20/2025-11-20T18+10+04.243478241Z_seq0_minr1_maxr501_orgn0.pces
Last file to copy: data/saved/preconsensus-events/3/2025/11/20/2025-11-20T18+13+53.099078562Z_seq1_minr474_maxr5474_orgn0.pces
node3 4m 12.288s 2025-11-20 18:14:01.242 5994 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 2 preconsensus event file(s)
node3 4m 12.300s 2025-11-20 18:14:01.254 5995 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 2 preconsensus event file(s)
node3 4m 12.300s 2025-11-20 18:14:01.254 5996 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 519 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/519 {"round":519,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/519/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 4m 12.333s 2025-11-20 18:14:01.287 6145 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 519 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/519
node0 4m 12.334s 2025-11-20 18:14:01.288 6146 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for 519
node0 4m 12.425s 2025-11-20 18:14:01.379 6188 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for 519
node0 4m 12.428s 2025-11-20 18:14:01.382 6189 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 519
Timestamp: 2025-11-20T18:14:00.016553Z
Next consensus number: 17228
Legacy running event hash: ccd797942cef7196af5c76c9008950b583bce7dba6dd4257ee0ee0165973964df4565b1580c5164f48036464cdf76301
Legacy running event mnemonic: gown-assault-over-crazy
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1515013193
Root hash: 1eea72d433e62eecc0eac6f65ff4bba2ebd78b75e0b7a9f1b19ac92b01ffab32b28f76f060b22cdf175f540b6f160215
(root) VirtualMap state / tail-fold-lamp-talk {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"divide-issue-fruit-end"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"remove-nerve-aerobic-attitude"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"calm-catalog-crush-cheese"}}}
node0 4m 12.435s 2025-11-20 18:14:01.389 6190 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/11/20/2025-11-20T18+10+04.284523695Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/0/2025/11/20/2025-11-20T18+13+53.162136071Z_seq1_minr474_maxr5474_orgn0.pces
node0 4m 12.436s 2025-11-20 18:14:01.390 6191 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus event files meeting specified criteria to copy.
Lower bound: 492
First file to copy: data/saved/preconsensus-events/0/2025/11/20/2025-11-20T18+10+04.284523695Z_seq0_minr1_maxr501_orgn0.pces
Last file to copy: data/saved/preconsensus-events/0/2025/11/20/2025-11-20T18+13+53.162136071Z_seq1_minr474_maxr5474_orgn0.pces
node0 4m 12.436s 2025-11-20 18:14:01.390 6192 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 2 preconsensus event file(s)
node0 4m 12.449s 2025-11-20 18:14:01.403 6193 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 2 preconsensus event file(s)
node0 4m 12.449s 2025-11-20 18:14:01.403 6194 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 519 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/519 {"round":519,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/519/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 5m 12.031s 2025-11-20 18:15:00.985 7498 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 657 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 5m 12.069s 2025-11-20 18:15:01.023 7822 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 657 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 5m 12.085s 2025-11-20 18:15:01.039 7583 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 657 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 5m 12.097s 2025-11-20 18:15:01.051 7579 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 657 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 5m 12.221s 2025-11-20 18:15:01.175 7586 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 657 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/657
node3 5m 12.222s 2025-11-20 18:15:01.176 7587 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for 657
node2 5m 12.248s 2025-11-20 18:15:01.202 7501 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 657 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/657
node2 5m 12.249s 2025-11-20 18:15:01.203 7502 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for 657
node0 5m 12.291s 2025-11-20 18:15:01.245 7825 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 657 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/657
node0 5m 12.292s 2025-11-20 18:15:01.246 7826 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for 657
node3 5m 12.310s 2025-11-20 18:15:01.264 7618 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for 657
node3 5m 12.312s 2025-11-20 18:15:01.266 7619 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 657
Timestamp: 2025-11-20T18:15:00.116474487Z
Next consensus number: 20553
Legacy running event hash: 60826b8d9fe92ac217c9447ce58fe3a682631e3cba7cb8b3e1c080b089cd0e3f9bf2fcdf4b4ebf489b5a13f9976b22d1
Legacy running event mnemonic: aware-immune-actual-flame
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -13720108
Root hash: d689bccd53f09382807505ce4c8b61bdc1154c2595c6e15dce59266582da90884d7f6ecbf2e6a9880b43fe590449caf6
(root) VirtualMap state / confirm-grid-lawn-more {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"rug-purse-method-arrow"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"whip-snake-average-describe"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"bubble-own-assume-shuffle"}}}
node3 5m 12.320s 2025-11-20 18:15:01.274 7620 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/11/20/2025-11-20T18+13+53.099078562Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/3/2025/11/20/2025-11-20T18+10+04.243478241Z_seq0_minr1_maxr501_orgn0.pces
node3 5m 12.320s 2025-11-20 18:15:01.274 7621 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 630
File: data/saved/preconsensus-events/3/2025/11/20/2025-11-20T18+13+53.099078562Z_seq1_minr474_maxr5474_orgn0.pces
node3 5m 12.320s 2025-11-20 18:15:01.274 7622 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 5m 12.323s 2025-11-20 18:15:01.277 7623 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 5m 12.323s 2025-11-20 18:15:01.277 7624 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 657 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/657 {"round":657,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/657/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 5m 12.325s 2025-11-20 18:15:01.279 7582 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 657 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/657
node3 5m 12.325s 2025-11-20 18:15:01.279 7625 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1
node1 5m 12.326s 2025-11-20 18:15:01.280 7583 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for 657
node2 5m 12.329s 2025-11-20 18:15:01.283 7541 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for 657
node2 5m 12.331s 2025-11-20 18:15:01.285 7542 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 657
Timestamp: 2025-11-20T18:15:00.116474487Z
Next consensus number: 20553
Legacy running event hash: 60826b8d9fe92ac217c9447ce58fe3a682631e3cba7cb8b3e1c080b089cd0e3f9bf2fcdf4b4ebf489b5a13f9976b22d1
Legacy running event mnemonic: aware-immune-actual-flame
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -13720108
Root hash: d689bccd53f09382807505ce4c8b61bdc1154c2595c6e15dce59266582da90884d7f6ecbf2e6a9880b43fe590449caf6
(root) VirtualMap state / confirm-grid-lawn-more {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"rug-purse-method-arrow"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"whip-snake-average-describe"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"bubble-own-assume-shuffle"}}}
node2 5m 12.339s 2025-11-20 18:15:01.293 7543 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/11/20/2025-11-20T18+13+53.164255242Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/2/2025/11/20/2025-11-20T18+10+04.273287088Z_seq0_minr1_maxr501_orgn0.pces
node2 5m 12.339s 2025-11-20 18:15:01.293 7544 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 630
File: data/saved/preconsensus-events/2/2025/11/20/2025-11-20T18+13+53.164255242Z_seq1_minr474_maxr5474_orgn0.pces
node2 5m 12.340s 2025-11-20 18:15:01.294 7545 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 5m 12.342s 2025-11-20 18:15:01.296 7546 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 5m 12.343s 2025-11-20 18:15:01.297 7547 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 657 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/657 {"round":657,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/657/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 5m 12.345s 2025-11-20 18:15:01.299 7548 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1
node0 5m 12.374s 2025-11-20 18:15:01.328 7857 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for 657
node0 5m 12.376s 2025-11-20 18:15:01.330 7858 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 657
Timestamp: 2025-11-20T18:15:00.116474487Z
Next consensus number: 20553
Legacy running event hash: 60826b8d9fe92ac217c9447ce58fe3a682631e3cba7cb8b3e1c080b089cd0e3f9bf2fcdf4b4ebf489b5a13f9976b22d1
Legacy running event mnemonic: aware-immune-actual-flame
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -13720108
Root hash: d689bccd53f09382807505ce4c8b61bdc1154c2595c6e15dce59266582da90884d7f6ecbf2e6a9880b43fe590449caf6
(root) VirtualMap state / confirm-grid-lawn-more {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"rug-purse-method-arrow"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"whip-snake-average-describe"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"bubble-own-assume-shuffle"}}}
node0 5m 12.382s 2025-11-20 18:15:01.336 7859 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/11/20/2025-11-20T18+10+04.284523695Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/0/2025/11/20/2025-11-20T18+13+53.162136071Z_seq1_minr474_maxr5474_orgn0.pces
node0 5m 12.382s 2025-11-20 18:15:01.336 7860 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 630
File: data/saved/preconsensus-events/0/2025/11/20/2025-11-20T18+13+53.162136071Z_seq1_minr474_maxr5474_orgn0.pces
node0 5m 12.382s 2025-11-20 18:15:01.336 7861 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 5m 12.385s 2025-11-20 18:15:01.339 7862 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 5m 12.386s 2025-11-20 18:15:01.340 7863 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 657 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/657 {"round":657,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/657/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 5m 12.387s 2025-11-20 18:15:01.341 7864 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1
node1 5m 12.415s 2025-11-20 18:15:01.369 7614 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for 657
node1 5m 12.417s 2025-11-20 18:15:01.371 7615 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 657
Timestamp: 2025-11-20T18:15:00.116474487Z
Next consensus number: 20553
Legacy running event hash: 60826b8d9fe92ac217c9447ce58fe3a682631e3cba7cb8b3e1c080b089cd0e3f9bf2fcdf4b4ebf489b5a13f9976b22d1
Legacy running event mnemonic: aware-immune-actual-flame
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -13720108
Root hash: d689bccd53f09382807505ce4c8b61bdc1154c2595c6e15dce59266582da90884d7f6ecbf2e6a9880b43fe590449caf6
(root) VirtualMap state / confirm-grid-lawn-more {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"rug-purse-method-arrow"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"whip-snake-average-describe"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"bubble-own-assume-shuffle"}}}
node1 5m 12.424s 2025-11-20 18:15:01.378 7616 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/11/20/2025-11-20T18+10+03.963750774Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/1/2025/11/20/2025-11-20T18+13+53.150382113Z_seq1_minr474_maxr5474_orgn0.pces
node1 5m 12.424s 2025-11-20 18:15:01.378 7617 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 630
File: data/saved/preconsensus-events/1/2025/11/20/2025-11-20T18+13+53.150382113Z_seq1_minr474_maxr5474_orgn0.pces
node1 5m 12.424s 2025-11-20 18:15:01.378 7618 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 5m 12.427s 2025-11-20 18:15:01.381 7619 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 5m 12.427s 2025-11-20 18:15:01.381 7620 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 657 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/657 {"round":657,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/657/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 5m 12.429s 2025-11-20 18:15:01.383 7621 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1
node4 5m 55.902s 2025-11-20 18:15:44.856 1 INFO STARTUP <main> StaticPlatformBuilder:
////////////////////// // Node is Starting // //////////////////////
node4 5m 55.997s 2025-11-20 18:15:44.951 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node4 5m 56.013s 2025-11-20 18:15:44.967 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 5m 56.135s 2025-11-20 18:15:45.089 4 INFO STARTUP <main> Browser: The following nodes [4] are set to run locally
node4 5m 56.166s 2025-11-20 18:15:45.120 5 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node4 5m 57.717s 2025-11-20 18:15:46.671 6 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1550ms
node4 5m 57.726s 2025-11-20 18:15:46.680 7 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node4 5m 57.729s 2025-11-20 18:15:46.683 8 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 5m 57.771s 2025-11-20 18:15:46.725 9 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node4 5m 57.835s 2025-11-20 18:15:46.789 10 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node4 5m 57.836s 2025-11-20 18:15:46.790 11 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node4 5m 58.721s 2025-11-20 18:15:47.675 12 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node4 5m 58.816s 2025-11-20 18:15:47.770 15 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 58.824s 2025-11-20 18:15:47.778 16 INFO STARTUP <main> StartupStateUtils: The following saved states were found on disk:
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/381
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/249
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/120
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1
node4 5m 58.824s 2025-11-20 18:15:47.778 17 INFO STARTUP <main> StartupStateUtils: Loading latest state from disk.
node4 5m 58.824s 2025-11-20 18:15:47.778 18 INFO STARTUP <main> StartupStateUtils: Loading signed state from disk: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/381
node4 5m 58.833s 2025-11-20 18:15:47.787 19 INFO STATE_TO_DISK <main> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp
node4 5m 58.956s 2025-11-20 18:15:47.910 29 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 5m 59.812s 2025-11-20 18:15:48.766 31 INFO STARTUP <main> StartupStateUtils: Loaded state's hash is the same as when it was saved.
node4 5m 59.821s 2025-11-20 18:15:48.775 32 INFO STARTUP <main> StartupStateUtils: Platform has loaded a saved state {"round":381,"consensusTimestamp":"2025-11-20T18:13:00.018740Z"} [com.swirlds.logging.legacy.payload.SavedStateLoadedPayload]
node4 5m 59.827s 2025-11-20 18:15:48.781 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 59.828s 2025-11-20 18:15:48.782 37 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node4 5m 59.834s 2025-11-20 18:15:48.788 39 INFO STARTUP <main> AddressBookInitializer: Using the loaded state's address book and weight values.
node4 5m 59.845s 2025-11-20 18:15:48.799 40 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 59.848s 2025-11-20 18:15:48.802 41 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 6.016m 2025-11-20 18:15:49.892 42 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=25950444]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=275230, randomLong=-7326087772858144152, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=19391, randomLong=883761385450188109, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1261939, data=35, exception=null]
OS Health Check Report - Complete (took 1025 ms)
node4 6.016m 2025-11-20 18:15:49.927 43 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node4 6m 1.114s 2025-11-20 18:15:50.068 44 INFO STARTUP <main> PcesUtilities: Span compaction completed for data/saved/preconsensus-events/4/2025/11/20/2025-11-20T18+10+04.145266281Z_seq0_minr1_maxr501_orgn0.pces, new upper bound is 388
node4 6m 1.118s 2025-11-20 18:15:50.072 45 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node4 6m 1.120s 2025-11-20 18:15:50.074 46 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node4 6m 1.229s 2025-11-20 18:15:50.183 47 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIdUmpLKzyXgUwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBALXCoDQ+HOVsEDTZpFuJITSaGwaKX2is5K1P/lV+G+ll6u36IdqKNnZIirJrpX2N0Ad6NeF/oFcMhietrKt818PDA9Tbb2tqcHNKTxxZAEj7amQTsrU4EsNmUhaPgMs89yj9WLxCXVzW05cQjqYEA/hymzohWs1BdU3Y2KdmELe0v5fzRgDpNgYHhUN7IrlrlgXEWpuKRskBYc4PIvyACijY0/zkeEAyHOshYYGKhQbNm/NGWhFq83ro77CZZhX3Vl7hRnHLaEoCEE8atY8R1Txhy8aObhiS6R8ZVRTkZLar/FG/xe78RQfwHHD1al2w5oHR7xgTZylhbD+nVQ09Zmi25USpvqwumbMBE0OWhV+VH1WLCHfLQs6/5yuDjeZ/0D9tpQ8pfkiEkGLedzUzQkq+4/HmN4IFTOhgJHlu1tVUqohZIPZ5zSzqkqFzFQGRo2uAX8C2EJ3qgQMAEOpH8iOjiSKsezlIPuwvmrVDPxVfpY2Cq60oxRu6B8bZdbQkfwIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQAloxwiVu7pBhkO4fLqYRw4FC0VEx+c47W4xnrq3G/uXMGwE2Mfwple9FZnfT9JgSoT1UVw+cigo4720WdrPqkK8qnA3/PzGXlfJ3k6eFcBuli/KY1TakIJUAxFt5biNKatheMwAKsbF/JyVyaqG2dbSaXQ6hZBLQTYmLrmFWMvi9QdM1S8vNVMjn0hE2qQJtnVRuVwqRaAQ225jDv2CUCT28t0EWE6ccbiRi74l8KoW1Lo3v2EQ6ZZ89Xt3CwFSQHa6YVT685ECy82qMysU+YHBe9WmwJW05UAAY7JRsOo+RuuU/r4acNLmzprG+l7qsqqPkwXTcziw9Y2OYsFgY4bTlIOV0JC0AYApctDB3gbn83LM73CWccGrXq0liSV0wL11wscH3gFohXrwb646+6hgncZiDshlZlWaFSkHQJAxTR9bsbsCwKdZpzIIVOVTOT/3oLQKCCQvPriTpJiNa0P6gB0pq64lNcyG9fL8vS3YFFnWJTZwb8ZzGK+LZ91/2Y=", "gossipEndpoint": [{ "ipAddressV4": "iHFhtA==", "port": 30124 }, { "ipAddressV4": "CoAAEQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAJguXwyGFpb8MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTIwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTIwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDYXoYHBtw8adD5sxLZSnlG9XgBLWVbIDl3YA4rZZ11cgl6FG2TvF8UVNXQ177cRm1xUUJRI5ulSgDofnm7Iuf6c/GoQrud2nP1yMWewGslwiEi1h2pxbN7doFvn/92Y0lJVwSV/vOpbIyPRoMeF0jXd7TEI7dYj4S7gV9uWmQCIWjwTZqVsjIAtzEkYnmS0/m5XuD9MJsin8OQRu/PEFL8qaVPQJ2GhOhpUJqvADQ/Lsq/FHcPjylcRcnUQlFRojk2jqugtoRegByjPrAOSYGJeWUCVYmd7W51L/AkVx1rDLeHj0zLTTzQRF5G56i+S+tAcpY/uiCrwLvszFlDlD1diOuaucmu54lalrSTlVe5eOyq2ga2tKi11LQ+w09105zLyRWk7DBU93f5dTYNSmokI7b4sVRxu6SP0p/F9wND77wv2Ax5OpIWWty8zy8Y+xOuRyFu/rJ4ddDmRYvRmptM0rCAfv6hgd3m5Y/OAadQm/OuN91Uq9PIJdlMtjDbIfECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEANutmL3V1PlvlsZ6xG8Sx9cKTok3kf3rBf7D7eE8Nn8ryHi3cw9CvCaj1E6zmTTh9k23DAZVWulhjTY5GWcx5NO7QAWjKau44g/HecNNrWsD/+nIrhmAk2WxKp175CwqJaIWA7CM6VMfFktjaflUPcB6RJnHrAa8M1HUpEsBz0mFmLz7lIaDemxYCE8M8slb6wTMjpL83GB+ejudRe7YK2ZWixM+CGp0ARkV+EecHaCXgEoROUNwP6mZVJcgSVR1QBQwcGAMIrutsKENM8HR9o3LWacigoJXf+IX8c6aJhrHfFvm62q+hi3baj7iR6gebEdWPtmEXgoVWOk230fLGyPU1oBxaDdYa8V4+ZFv03O91By9tuFrwZOcLCb4CPRyr8A47lHNjRIeo2nUF/c+SjV0eBcPKCnn1nW/AQWCxJ0QzzG6tEeMAGdDrE2ujPlB+Y9Sn8vB0zjYQHTr1NKyyXNogB4y48jofLDLDGOQYI6uP2fDgZeiq4dV8w91WbPHV", "gossipEndpoint": [{ "ipAddressV4": "IkW49A==", "port": 30125 }, { "ipAddressV4": "CoAAGA==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJwswl59m488MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDAqlNMpfduuW0ETQVjdKf5ZBe3Ug/ybRMoCWIlue8UoxFzamAtoeFEW3GVi862iImRVyHbkBZzDQUw4ABwMdxfzTL9voozkMaOZb4KQ9yZ9zNLAAmSSuE6RFmSJnBtfufxFXqiu6esbcvyropjZLc65F2uoMCpKN0CHFpWEb2GZAaipp7WCOon0NllDLqkjPylluXO4mjbzzMSDPbBWRD8VjjkxZeszWSXYxz9hqcRYX01CGg+jhooCQ6j2yB8sfFAffIeTG6GSV1uCFa4san2emhQWpr+cHaVYJMtejL43HaEVQnF3vh5Z10T/7co63C63aay2hs6Bx5SschosyYiafI7GtbQ4qpOgjEDFT1jlydK21gy6MV3SFEYwcUfxvxxRj6pS7xiMFn4FYnBKPJWkaDkwTqboEshxstvASQOW993uEwzh4EjctRHSjSuTU6S9OsWi5I5cRF+xK6GaWsTp0KyO8uVpuM9kZfpOcor294quyKJ9nylNyIt/m8Q8/ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAqPLB/xr0Yv1l9w/RO+bqFtl8TkxF/6jOqoEUXY06dEInopLYpmkksZZ9G8vebt6hAoLjaxNMdRqCkzKgy4jn7/SQZNV9FMbZ7ckiDxsBxYZ2ZaBootuWzzVD6hCSO3Tg6JgkIzldtFtNcDVBRgZnHg+Rl6hn+gFV5S2OTTTPHWK7GHwgHXLhK7N0RL4YVrRCi/HTUZnuYCjBwvdDte5iqytY05cAO4p72P6YtDaOdAfL/IIKd1ylCWITDqTp/JDBz1uxjQmsXLVD/KEEtlvYlGjIr+wUUqIUPhFvB6ajl2NO0D/r+t1BH454zbodU92QnOJpXpoNuOv7jjALHCqo70mCSwTNUSZuVP6/KLmQe8sSzYs7O/c25FzHKBYy+aZujoa/X7aI6XVmsUkj6ae9MSvQurk0jMNg/Jy5EtWOMy7WEuyadrAv6KSP3oIfmL9jWoPcyOMfvjRHxGqOfZuFZatAwswY6O0E3ATTrN03t/BVqNHIYIXc6UOiUTo2Nx56", "gossipEndpoint": [{ "ipAddressV4": "IhBJhQ==", "port": 30126 }, { "ipAddressV4": "CoAALA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAOxH0o7YkAUoMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDf6+SJl+puqRNd5r2Tb802jQTqPm7k3NXIeU8NQ3Hy9p0G+9p4Hgnt3ftipar7lKPKnp4PFrOP7E7XSKpafxK2OVQ0jTMvc6Yjqt+9mzyNSI1I8cSHTmhJ7kMBt0+NwVM8QN+fbKcbQaoNiPwMcckVtGeMad4aZM6hRyxzI0H3wgMj4JiM9VRwx7JbEo3R7akRwLwGr9ZQm2EQwqiyReNkBnXrsyP4KPPVAoeMfGchoAuBbV+r6v1OeYddocYmZkrsvMXUKF/uEcgd8gTu+pv3jObwIEVqXo1yC6ZlCFqO7LIvT8jTAAljkszoo67ykXTbKS0PZeLDg6nvdPvBMQ50yjfswR88S6N8VU6pud7Y+VbMYUiGzlrFi4MB9dikAjEj4PEetQyZdn84ZXGxerXlU/vTO2Fp4i1ec5rmX1P0WYMlbNELE408j5nfCfzD/qdcF5HZAiUVTYU/SWpzWcn34++KGpuqZZQdsGwCLQWeMeA/OEemYChis4cO94aOzrECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAlj5YIsbYXk2JGP9kRCBLDgz27ymYi1KDbO8g18V4T0zj2Zl7858U7mF9UBSSW+Cjl1UtUdvqFWZhh8jRoO3Jov1QGTULHRfyyPElD4VpwFribiu4GYJaodYy6NE50WwSJf32gLG0jHQWt7q+cOrn6WaG2h8O1sIxbTlnu1kqKQUQtu4oX8u23b5m9QXVJfJVdecwD5Rmab2d3dq/NNv2iNELH0myqtcoqw26xwIvXwaS4Gqi+Y0cOfjWL5Gv5AHIwvBXGIh3KUU7pbyBzqjkigbzSeoZw0C8G2cRTl0+QTuet2SVYlFh5J9/FBLvIfMfIpguglaU6xTVoRpo7RF24qQKFt2IlBROpqcwl0FyfE+2c19FGt1V8E5dYqE4T2mHT6FSOI3DckA2afBm1OCeMNtkqCQT8x+JvdKrgUh44QDm4PIVZDzaxog/zOzRWPCgpCPq0HcNMzgCVFt+4q8eTL9Ju/rQcS9bDosjMA69NGLIOCdPW2i/gkS9x9rTXgyp", "gossipEndpoint": [{ "ipAddressV4": "IoiZJQ==", "port": 30127 }, { "ipAddressV4": "CoAAKQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIXlngkVEv6iMwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAL4o3FK8th1cG+FSlw4iT9FlkwK+hOj4Ay6Z70mZlsNwszgxvddUEO4BEdA1iSWfxkYOLl4QwwPr3l394a07VfB5OK3dqJ6CjVdByyvzghtk3gOpkskWlJxp6vah7BbIJFWE8off7fhCdwAGSrwIRdGE8u8GbKJIdHk6/XyjB3j0BXTIgeaPTJxLeuz/2l/dQVRMXyZNxlc5UVQYnX9haMRk7M5bkb9uwfYPRikEJFp6G72x7M7Q9lBGJ3ArCQn/lPJfHSg01GxfDhWH8DOwLaFdv1bCs2zHTn7R7Wq9ymXvkUsZhlYO4mLR8HKDcM3sCrJa2rg8vgnIoZupHABKxkgtT2wxV7fM5f2oiz0mDYDTRJpgmK1lmNANj2tKnGqeDnsW7Q3zwufgZZhbks8+8uigyOyKNbp6D7Vv5KeYRibjr/xh+yWT0v02dtpBIdhqDa5CUVD9fCwigZj3PQc8N4e47ZL6s1pXpQ6Cf0lB0fSsvyhnGRa8HMx2q5eg5j/lCQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQCr9yUzOoi0xhoDE1mqR3FR/iVCq9PaBUURWL743LDMrlEvpzKX0upcwwwdgJFjVqVUywh6rKeHQt4O4UV6FIbpp0PSjSE7XZSK3UNqnhZJhQ3aNrOP+6wBhm2B0ZjrxyMS1EWeD9tcNkdYluO00RlieAEV4zwoAfeFPSB21iXW5dU8idhNuTLptDc7SJoErxN+44jvcrSe/ZhpQohG6WfyDPH0BE1tyzsiD29PAWKkrfhg5kzjTAP/qFp+ByazeltP9/F0NXI5AHbE0pKYr56XUlwDfDZOTU9b1YeS7kKyPvccvC2j9NjGGM7NjafdFLHUTYBZiNUTZXVstddYtTCVbTqI7I/x6hoeeNVDZv7XluwZLrYsDNsNrWU3c9VijPK1CE5Owy+gJoGgxEHfA/n9Jvc3lEesqKBpW92RazkpHW2eD9wh8Ayv3q6PNDGzWyiXA8YWW6yD/dIp2Oh8szZUfOXy8sQ8VW86T6RsqGP5CKKPGW1NnP/KTKe5/WoBLZQ=", "gossipEndpoint": [{ "ipAddressV4": "Ink6nQ==", "port": 30128 }, { "ipAddressV4": "CoAAJQ==", "port": 30128 }] }] }
node4 6m 1.257s 2025-11-20 18:15:50.211 48 INFO STARTUP <main> ConsistencyTestingToolState: State initialized with state long 3112695353987981019.
node4 6m 1.258s 2025-11-20 18:15:50.212 49 INFO STARTUP <main> ConsistencyTestingToolState: State initialized with 381 rounds handled.
node4 6m 1.258s 2025-11-20 18:15:50.212 50 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 6m 1.259s 2025-11-20 18:15:50.213 51 INFO STARTUP <main> TransactionHandlingHistory: Log file found. Parsing previous history
node4 6m 1.310s 2025-11-20 18:15:50.264 52 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 381
Timestamp: 2025-11-20T18:13:00.018740Z
Next consensus number: 13876
Legacy running event hash: 603d375dda65e4d930e983c75f8eec179264436ea50a3609cb200d48a69d5cd2f3eeefdebac26fdb8c83c934589c27e6
Legacy running event mnemonic: amateur-two-recall-gossip
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -2016645397
Root hash: 6730f23a9abb835a743fbcb4ac5c6a9efd45891ae593010285842ebf0380af87aecb64527710558a3898ab300d5fb8d2
(root) VirtualMap state / toward-rhythm-post-network {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"path-relax-exercise-cream"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"hint-kiss-guard-man"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"notable-estate-upon-present"}}}
node4 6m 1.316s 2025-11-20 18:15:50.270 54 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: Starting the ReconnectController
node4 6m 1.541s 2025-11-20 18:15:50.495 55 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 603d375dda65e4d930e983c75f8eec179264436ea50a3609cb200d48a69d5cd2f3eeefdebac26fdb8c83c934589c27e6
node4 6m 1.552s 2025-11-20 18:15:50.506 56 INFO STARTUP <platformForkJoinThread-4> Shadowgraph: Shadowgraph starting from expiration threshold 354
node4 6m 1.561s 2025-11-20 18:15:50.515 58 INFO STARTUP <<start-node-4>> ConsistencyTestingToolMain: init called in Main for node 4.
node4 6m 1.561s 2025-11-20 18:15:50.515 59 INFO STARTUP <<start-node-4>> SwirldsPlatform: Starting platform 4
node4 6m 1.563s 2025-11-20 18:15:50.517 60 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node4 6m 1.567s 2025-11-20 18:15:50.521 61 INFO STARTUP <<start-node-4>> CycleFinder: No cyclical back pressure detected in wiring model.
node4 6m 1.569s 2025-11-20 18:15:50.523 62 INFO STARTUP <<start-node-4>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node4 6m 1.570s 2025-11-20 18:15:50.524 63 INFO STARTUP <<start-node-4>> InputWireChecks: All input wires have been bound.
node4 6m 1.572s 2025-11-20 18:15:50.526 64 INFO STARTUP <<start-node-4>> SwirldsPlatform: replaying preconsensus event stream starting at 354
node4 6m 1.578s 2025-11-20 18:15:50.532 65 INFO PLATFORM_STATUS <platformForkJoinThread-3> StatusStateMachine: Platform spent 197.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node4 6m 1.850s 2025-11-20 18:15:50.804 66 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:4 H:fc55620dcd5f BR:379), num remaining: 4
node4 6m 1.852s 2025-11-20 18:15:50.806 67 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:0 H:633da155d1a7 BR:379), num remaining: 3
node4 6m 1.852s 2025-11-20 18:15:50.806 68 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:1 H:50411963bce5 BR:380), num remaining: 2
node4 6m 1.853s 2025-11-20 18:15:50.807 69 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:2 H:b1846dd28275 BR:379), num remaining: 1
node4 6m 1.853s 2025-11-20 18:15:50.807 70 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:3 H:972bf56d1044 BR:379), num remaining: 0
node4 6m 1.919s 2025-11-20 18:15:50.873 99 INFO STARTUP <<start-node-4>> PcesReplayer: Replayed 1,221 preconsensus events with max birth round 388. These events contained 1,697 transactions. 6 rounds reached consensus spanning 2.6 seconds of consensus time. The latest round to reach consensus is round 387. Replay took 345.0 milliseconds.
node4 6m 1.922s 2025-11-20 18:15:50.876 101 INFO STARTUP <<app: appMain 4>> ConsistencyTestingToolMain: run called in Main.
node4 6m 1.923s 2025-11-20 18:15:50.877 102 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 344.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node4 6m 2.865s 2025-11-20 18:15:51.819 137 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: Preparing for reconnect, stopping gossip
node4 6m 2.865s 2025-11-20 18:15:51.819 138 INFO RECONNECT <<platform-core: SyncProtocolWith3 4 to 3>> RpcPeerHandler: SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=387,newEventBirthRound=388,ancientThreshold=358,expiredThreshold=354] remote ev=EventWindow[latestConsensusRound=773,newEventBirthRound=774,ancientThreshold=746,expiredThreshold=672]
node4 6m 2.865s 2025-11-20 18:15:51.819 139 INFO RECONNECT <<platform-core: SyncProtocolWith1 4 to 1>> RpcPeerHandler: SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=387,newEventBirthRound=388,ancientThreshold=358,expiredThreshold=354] remote ev=EventWindow[latestConsensusRound=773,newEventBirthRound=774,ancientThreshold=746,expiredThreshold=672]
node4 6m 2.865s 2025-11-20 18:15:51.819 141 INFO RECONNECT <<platform-core: SyncProtocolWith0 4 to 0>> RpcPeerHandler: SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=387,newEventBirthRound=388,ancientThreshold=358,expiredThreshold=354] remote ev=EventWindow[latestConsensusRound=773,newEventBirthRound=774,ancientThreshold=746,expiredThreshold=672]
node4 6m 2.865s 2025-11-20 18:15:51.819 140 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> RpcPeerHandler: SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=387,newEventBirthRound=388,ancientThreshold=358,expiredThreshold=354] remote ev=EventWindow[latestConsensusRound=773,newEventBirthRound=774,ancientThreshold=746,expiredThreshold=672]
node4 6m 2.866s 2025-11-20 18:15:51.820 142 INFO PLATFORM_STATUS <platformForkJoinThread-6> StatusStateMachine: Platform spent 941.0 ms in OBSERVING. Now in BEHIND
node4 6m 2.866s 2025-11-20 18:15:51.820 143 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: Preparing for reconnect, start clearing queues
node0 6m 2.934s 2025-11-20 18:15:51.888 9160 INFO RECONNECT <<platform-core: SyncProtocolWith4 0 to 4>> RpcPeerHandler: OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=773,newEventBirthRound=774,ancientThreshold=746,expiredThreshold=672] remote ev=EventWindow[latestConsensusRound=387,newEventBirthRound=388,ancientThreshold=358,expiredThreshold=354]
node2 6m 2.934s 2025-11-20 18:15:51.888 8832 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> RpcPeerHandler: OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=773,newEventBirthRound=774,ancientThreshold=746,expiredThreshold=672] remote ev=EventWindow[latestConsensusRound=387,newEventBirthRound=388,ancientThreshold=358,expiredThreshold=354]
node1 6m 2.935s 2025-11-20 18:15:51.889 8937 INFO RECONNECT <<platform-core: SyncProtocolWith4 1 to 4>> RpcPeerHandler: OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=773,newEventBirthRound=774,ancientThreshold=746,expiredThreshold=672] remote ev=EventWindow[latestConsensusRound=387,newEventBirthRound=388,ancientThreshold=358,expiredThreshold=354]
node3 6m 2.935s 2025-11-20 18:15:51.889 8913 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> RpcPeerHandler: OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=773,newEventBirthRound=774,ancientThreshold=746,expiredThreshold=672] remote ev=EventWindow[latestConsensusRound=387,newEventBirthRound=388,ancientThreshold=358,expiredThreshold=354]
node4 6m 3.019s 2025-11-20 18:15:51.973 144 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: Queues have been cleared
node4 6m 3.020s 2025-11-20 18:15:51.974 145 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: Waiting for a state to be obtained from a peer
node2 6m 3.113s 2025-11-20 18:15:52.067 8833 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> ReconnectStateTeacher: Starting reconnect in the role of the sender {"receiving":false,"nodeId":2,"otherNodeId":4,"round":774} [com.swirlds.logging.legacy.payload.ReconnectStartPayload]
node2 6m 3.115s 2025-11-20 18:15:52.069 8834 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> ReconnectStateTeacher: The following state will be sent to the learner:
Round: 774
Timestamp: 2025-11-20T18:15:50.932826Z
Next consensus number: 23325
Legacy running event hash: 2920213bfff113b486e65a8ad2f67cdae5b47c322604a70c9f0badf201a123ce8c824a7e238305985101cf8d21a969bc
Legacy running event mnemonic: catch-cherry-acid-isolate
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1898516850
Root hash: 7575ced77db3e20ad37de8af3e5f543ed451d4d460e5ae1ad8aa7f30e2b7dd26091c5ea1e1265370923c164a974a4c6c
(root) VirtualMap state / push-cement-reopen-salute {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"winner-exhibit-rely-picnic"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"slice-pottery-film-garment"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"labor-tomorrow-summer-smoke"}}}
node2 6m 3.115s 2025-11-20 18:15:52.069 8835 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> ReconnectStateTeacher: Sending signatures from nodes 1, 2, 3 (signing weight = 37500000000/50000000000) for state hash 7575ced77db3e20ad37de8af3e5f543ed451d4d460e5ae1ad8aa7f30e2b7dd26091c5ea1e1265370923c164a974a4c6c
node2 6m 3.116s 2025-11-20 18:15:52.070 8836 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> ReconnectStateTeacher: Starting synchronization in the role of the sender.
node4 6m 3.183s 2025-11-20 18:15:52.137 146 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> ReconnectStatePeerProtocol: Starting reconnect in the role of the receiver. {"receiving":true,"nodeId":4,"otherNodeId":2,"round":387} [com.swirlds.logging.legacy.payload.ReconnectStartPayload]
node4 6m 3.184s 2025-11-20 18:15:52.138 147 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> ReconnectStateLearner: Receiving signed state signatures
node4 6m 3.186s 2025-11-20 18:15:52.140 148 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> ReconnectStateLearner: Received signatures from nodes 1, 2, 3
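Node2's teacher line above advertises signatures from nodes 1, 2 and 3 with signing weight 37500000000/50000000000, i.e. 75% of the total weight. The sketch below shows the kind of weight check a learner could apply before trusting a received state; the strict-majority threshold is an assumption, since the validator's actual rule is not visible in this log.

    // Hypothetical signature-weight check; threshold is an assumption, values come from the log above.
    public class StateSignatureWeightSketch {

        // Assumed rule: a state is trustworthy once its signatures cover a strict majority of total weight.
        static boolean hasSufficientWeight(long signedWeight, long totalWeight) {
            return signedWeight * 2 > totalWeight;
        }

        public static void main(String[] args) {
            // Values copied from the ReconnectStateTeacher line:
            // "signing weight = 37500000000/50000000000" for nodes 1, 2, 3.
            long signed = 37_500_000_000L;
            long total = 50_000_000_000L;
            System.out.println(hasSufficientWeight(signed, total)); // true (75% > 50%)
        }
    }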
node2 6m 3.238s 2025-11-20 18:15:52.192 8858 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> TeachingSynchronizer: sending tree rooted at com.swirlds.virtualmap.VirtualMap with route []
node2 6m 3.246s 2025-11-20 18:15:52.200 8859 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@29ca06d1 start run()
node4 6m 3.394s 2025-11-20 18:15:52.348 175 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> LearningSynchronizer: learner calls receiveTree()
node4 6m 3.394s 2025-11-20 18:15:52.348 176 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> LearningSynchronizer: synchronizing tree
node4 6m 3.395s 2025-11-20 18:15:52.349 177 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> LearningSynchronizer: receiving tree rooted at com.swirlds.virtualmap.VirtualMap with route []
node4 6m 3.405s 2025-11-20 18:15:52.359 178 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@e2716d8 start run()
node4 6m 3.461s 2025-11-20 18:15:52.415 179 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> ReconnectNodeRemover: setPathInformation(): firstLeafPath: 4 -> 4, lastLeafPath: 8 -> 8
node4 6m 3.462s 2025-11-20 18:15:52.416 180 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> ReconnectNodeRemover: setPathInformation(): done
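The path information above (firstLeafPath 4, lastLeafPath 8) is consistent with a complete binary tree holding the 5 leaves and 4 internal nodes reported later in the synchronization summary. A small sketch of that path arithmetic, under the assumption that paths are numbered breadth-first with the root at path 0:

    // Sketch of breadth-first path numbering for a binary merkle tree, assuming root = path 0.
    // With n leaves (n > 1) the leaves occupy paths n-1 .. 2n-2 and there are n-1 internal nodes.
    public class VirtualPathSketch {

        static long firstLeafPath(long leafCount) {
            return leafCount - 1;
        }

        static long lastLeafPath(long leafCount) {
            return 2 * leafCount - 2;
        }

        public static void main(String[] args) {
            long leaves = 5; // leafNodes=5 in the SynchronizationCompletePayload below
            System.out.println("firstLeafPath = " + firstLeafPath(leaves)); // 4, as logged
            System.out.println("lastLeafPath  = " + lastLeafPath(leaves));  // 8, as logged
            System.out.println("internalNodes = " + (leaves - 1));          // 4, as logged
            System.out.println("totalNodes    = " + (2 * leaves - 1));      // 9, as logged
        }
    }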
node4 6m 3.620s 2025-11-20 18:15:52.574 181 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread finished the learning loop for the current subtree
node4 6m 3.621s 2025-11-20 18:15:52.575 182 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: call nodeRemover.allNodesReceived()
node4 6m 3.621s 2025-11-20 18:15:52.575 183 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> ReconnectNodeRemover: allNodesReceived()
node4 6m 3.621s 2025-11-20 18:15:52.575 184 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> ReconnectNodeRemover: allNodesReceived(): done
node4 6m 3.622s 2025-11-20 18:15:52.576 185 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: call root.endLearnerReconnect()
node4 6m 3.622s 2025-11-20 18:15:52.576 186 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call reconnectIterator.close()
node4 6m 3.622s 2025-11-20 18:15:52.576 187 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call setHashPrivate()
node4 6m 3.644s 2025-11-20 18:15:52.598 197 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call postInit()
node4 6m 3.644s 2025-11-20 18:15:52.598 199 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: endLearnerReconnect() complete
node4 6m 3.644s 2025-11-20 18:15:52.598 200 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: close() complete
node4 6m 3.645s 2025-11-20 18:15:52.599 201 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread closed input, output, and view for the current subtree
node4 6m 3.646s 2025-11-20 18:15:52.600 202 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@e2716d8 finish run()
node4 6m 3.646s 2025-11-20 18:15:52.600 203 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> LearningSynchronizer: received tree rooted at com.swirlds.virtualmap.VirtualMap with route []
node4 6m 3.647s 2025-11-20 18:15:52.601 204 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> LearningSynchronizer: synchronization complete
node4 6m 3.647s 2025-11-20 18:15:52.601 205 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> LearningSynchronizer: learner calls initialize()
node4 6m 3.647s 2025-11-20 18:15:52.601 206 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> LearningSynchronizer: initializing tree
node4 6m 3.648s 2025-11-20 18:15:52.602 207 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> LearningSynchronizer: initialization complete
node4 6m 3.648s 2025-11-20 18:15:52.602 208 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> LearningSynchronizer: learner calls hash()
node4 6m 3.648s 2025-11-20 18:15:52.602 209 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> LearningSynchronizer: hashing tree
node4 6m 3.648s 2025-11-20 18:15:52.602 210 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> LearningSynchronizer: hashing complete
node4 6m 3.648s 2025-11-20 18:15:52.602 211 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> LearningSynchronizer: learner calls logStatistics()
node4 6m 3.651s 2025-11-20 18:15:52.605 212 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> LearningSynchronizer: Finished synchronization {"timeInSeconds":0.253,"hashTimeInSeconds":0.0,"initializationTimeInSeconds":0.0,"totalNodes":9,"leafNodes":5,"redundantLeafNodes":2,"internalNodes":4,"redundantInternalNodes":0} [com.swirlds.logging.legacy.payload.SynchronizationCompletePayload]
node4 6m 3.651s 2025-11-20 18:15:52.605 213 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> LearningSynchronizer: ReconnectMapMetrics: transfersFromTeacher=9; transfersFromLearner=8; internalHashes=3; internalCleanHashes=0; internalData=0; internalCleanData=0; leafHashes=5; leafCleanHashes=2; leafData=5; leafCleanData=2
node4 6m 3.652s 2025-11-20 18:15:52.606 214 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> LearningSynchronizer: learner is done synchronizing
node4 6m 3.653s 2025-11-20 18:15:52.607 215 INFO STARTUP <<platform-core: SyncProtocolWith2 4 to 2>> ConsistencyTestingToolState: New State Constructed.
node4 6m 3.657s 2025-11-20 18:15:52.611 216 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> ReconnectStateLearner: Reconnect data usage report {"dataMegabytes":0.0058650970458984375} [com.swirlds.logging.legacy.payload.ReconnectDataUsagePayload]
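The data usage payload above reports 0.0058650970458984375 MB, which corresponds to exactly 6150 bytes if megabytes are computed as bytes / 2^20. A one-line check of that arithmetic (the divisor is an assumption; the payload itself only exposes the megabyte value):

    // Checks the ReconnectDataUsagePayload arithmetic, assuming dataMegabytes = bytes / 2^20.
    public class ReconnectDataUsageSketch {
        public static void main(String[] args) {
            double reportedMegabytes = 0.0058650970458984375; // from the log line above
            double bytes = reportedMegabytes * (1 << 20);
            System.out.println(bytes); // 6150.0
        }
    }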
node2 6m 3.677s 2025-11-20 18:15:52.631 8863 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@29ca06d1 finish run()
node2 6m 3.678s 2025-11-20 18:15:52.632 8864 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> TeachingSynchronizer: finished sending tree
node2 6m 3.680s 2025-11-20 18:15:52.634 8867 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> ReconnectStateTeacher: Finished synchronization in the role of the sender.
node2 6m 3.730s 2025-11-20 18:15:52.684 8868 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> ReconnectStateTeacher: Finished reconnect in the role of the sender. {"receiving":false,"nodeId":2,"otherNodeId":4,"round":774} [com.swirlds.logging.legacy.payload.ReconnectFinishPayload]
node4 6m 3.754s 2025-11-20 18:15:52.708 217 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> ReconnectStatePeerProtocol: Finished reconnect in the role of the receiver. {"receiving":true,"nodeId":4,"otherNodeId":2,"round":774} [com.swirlds.logging.legacy.payload.ReconnectFinishPayload]
node4 6m 3.755s 2025-11-20 18:15:52.709 218 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> ReconnectStatePeerProtocol: Information for state received during reconnect:
Round: 774
Timestamp: 2025-11-20T18:15:50.932826Z
Next consensus number: 23325
Legacy running event hash: 2920213bfff113b486e65a8ad2f67cdae5b47c322604a70c9f0badf201a123ce8c824a7e238305985101cf8d21a969bc
Legacy running event mnemonic: catch-cherry-acid-isolate
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1898516850
Root hash: 7575ced77db3e20ad37de8af3e5f543ed451d4d460e5ae1ad8aa7f30e2b7dd26091c5ea1e1265370923c164a974a4c6c
(root) VirtualMap state / push-cement-reopen-salute {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"labor-tomorrow-summer-smoke"}}}
node4 6m 3.756s 2025-11-20 18:15:52.710 219 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: A state was obtained from a peer
node4 6m 3.757s 2025-11-20 18:15:52.711 220 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: The state obtained from a peer was validated
node4 6m 3.758s 2025-11-20 18:15:52.712 222 DEBUG RECONNECT <<platform-core: reconnectController>> ReconnectController: `loadState` : reloading state
node4 6m 3.758s 2025-11-20 18:15:52.712 223 INFO STARTUP <<platform-core: reconnectController>> ConsistencyTestingToolState: State initialized with state long -8041205640724462036.
node4 6m 3.759s 2025-11-20 18:15:52.713 224 INFO STARTUP <<platform-core: reconnectController>> ConsistencyTestingToolState: State initialized with 774 rounds handled.
node4 6m 3.759s 2025-11-20 18:15:52.713 225 INFO STARTUP <<platform-core: reconnectController>> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 6m 3.759s 2025-11-20 18:15:52.713 226 INFO STARTUP <<platform-core: reconnectController>> TransactionHandlingHistory: Log file found. Parsing previous history
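After reloading the state, the consistency testing tool reopens its per-node CSV log and replays the previously recorded history so divergence between runs can be detected. The actual column layout of ConsistencyTestLog.csv is not shown in this section; the sketch below only illustrates the append-then-verify idea with a hypothetical round,stateLong layout.

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.StandardOpenOption;
    import java.util.HashMap;
    import java.util.Map;

    // Hypothetical append-then-verify history, loosely modelled on TransactionHandlingHistory's CSV log.
    public class ConsistencyHistorySketch {

        private final Path csv;
        private final Map<Long, Long> stateLongByRound = new HashMap<>();

        ConsistencyHistorySketch(Path csv) throws IOException {
            this.csv = csv;
            if (Files.exists(csv)) {
                // "Log file found. Parsing previous history"
                for (String line : Files.readAllLines(csv)) {
                    String[] parts = line.split(",");
                    stateLongByRound.put(Long.parseLong(parts[0]), Long.parseLong(parts[1]));
                }
            }
        }

        void recordRound(long round, long stateLong) throws IOException {
            Long previous = stateLongByRound.get(round);
            if (previous != null && previous != stateLong) {
                throw new IllegalStateException("Inconsistent state for round " + round);
            }
            if (previous == null) {
                stateLongByRound.put(round, stateLong);
                Files.writeString(csv, round + "," + stateLong + System.lineSeparator(),
                        StandardOpenOption.CREATE, StandardOpenOption.APPEND);
            }
        }

        public static void main(String[] args) throws IOException {
            ConsistencyHistorySketch history =
                    new ConsistencyHistorySketch(Path.of("ConsistencyTestLog.csv"));
            history.recordRound(774, -8041205640724462036L); // values from the log lines above
        }
    }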
node4 6m 3.775s 2025-11-20 18:15:52.729 231 INFO STATE_TO_DISK <<platform-core: reconnectController>> DefaultSavedStateController: Signed state from round 774 created, will eventually be written to disk, for reason: RECONNECT
node4 6m 3.776s 2025-11-20 18:15:52.730 232 INFO PLATFORM_STATUS <platformForkJoinThread-6> StatusStateMachine: Platform spent 908.0 ms in BEHIND. Now in RECONNECT_COMPLETE
node4 6m 3.776s 2025-11-20 18:15:52.730 233 INFO STARTUP <platformForkJoinThread-5> Shadowgraph: Shadowgraph starting from expiration threshold 747
node4 6m 3.778s 2025-11-20 18:15:52.732 236 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 774 state to disk. Reason: RECONNECT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/774
node4 6m 3.779s 2025-11-20 18:15:52.733 237 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/3 for 774
node4 6m 3.787s 2025-11-20 18:15:52.741 244 INFO EVENT_STREAM <<platform-core: reconnectController>> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 2920213bfff113b486e65a8ad2f67cdae5b47c322604a70c9f0badf201a123ce8c824a7e238305985101cf8d21a969bc
node4 6m 3.788s 2025-11-20 18:15:52.742 245 INFO STARTUP <platformForkJoinThread-1> PcesFileManager: Due to recent operations on this node, the local preconsensus event stream will have a discontinuity. The last file with the old origin round is 2025-11-20T18+10+04.145266281Z_seq0_minr1_maxr388_orgn0.pces. All future files will have an origin round of 774.
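The preconsensus event stream (PCES) file names seen throughout this section encode a creation timestamp plus a sequence number, a minimum and maximum round, and an origin round, e.g. 2025-11-20T18+10+04.145266281Z_seq0_minr1_maxr388_orgn0.pces. The sketch below pulls those fields back out of a name; the field meanings are inferred from the discontinuity message above and may not cover every variant.

    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    // Parses the seq/minr/maxr/orgn fields visible in the PCES file names logged in this section.
    // The naming convention is inferred from the log itself, not from the platform source.
    public class PcesFileNameSketch {

        private static final Pattern NAME = Pattern.compile(
                "(?<ts>.+)_seq(?<seq>\\d+)_minr(?<minr>\\d+)_maxr(?<maxr>\\d+)_orgn(?<orgn>\\d+)\\.pces");

        public static void main(String[] args) {
            String name = "2025-11-20T18+15+53.319104201Z_seq1_minr747_maxr1247_orgn774.pces";
            Matcher m = NAME.matcher(name);
            if (m.matches()) {
                System.out.println("timestamp     = " + m.group("ts"));
                System.out.println("sequence      = " + m.group("seq"));   // 1
                System.out.println("minimum round = " + m.group("minr"));  // 747
                System.out.println("maximum round = " + m.group("maxr"));  // 1247
                System.out.println("origin round  = " + m.group("orgn"));  // 774, the post-reconnect origin
            }
        }
    }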
node4 6m 3.789s 2025-11-20 18:15:52.743 246 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: Reconnect almost done, resuming gossip
node4 6m 3.942s 2025-11-20 18:15:52.896 272 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/3 for 774
node4 6m 3.946s 2025-11-20 18:15:52.900 273 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 774
Timestamp: 2025-11-20T18:15:50.932826Z
Next consensus number: 23325
Legacy running event hash: 2920213bfff113b486e65a8ad2f67cdae5b47c322604a70c9f0badf201a123ce8c824a7e238305985101cf8d21a969bc
Legacy running event mnemonic: catch-cherry-acid-isolate
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1898516850
Root hash: 7575ced77db3e20ad37de8af3e5f543ed451d4d460e5ae1ad8aa7f30e2b7dd26091c5ea1e1265370923c164a974a4c6c
(root) VirtualMap state / push-cement-reopen-salute {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"winner-exhibit-rely-picnic"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"slice-pottery-film-garment"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"labor-tomorrow-summer-smoke"}}}
node4 6m 3.982s 2025-11-20 18:15:52.936 277 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/11/20/2025-11-20T18+10+04.145266281Z_seq0_minr1_maxr388_orgn0.pces
node4 6m 3.983s 2025-11-20 18:15:52.937 278 WARN STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: No preconsensus event files meeting specified criteria found to copy. Lower bound: 747
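Node4's only on-disk PCES file spans rounds 1-388, so nothing reaches the lower bound of 747 and the copy step is skipped with the warning above; by contrast, the other nodes' round-795 snapshots further down use a lower bound of 768 and do find a file to copy. A sketch of that filter, assuming the criterion is simply maxRound >= lowerBound:

    import java.util.List;

    // Sketch of the "meeting specified criteria" filter suggested by the BestEffortPcesFileCopy lines.
    // Assumption: a file is worth copying when its maximum round is at or above the lower bound.
    public class PcesCopyFilterSketch {

        record PcesFile(String name, long minRound, long maxRound) {}

        static List<PcesFile> filesToCopy(List<PcesFile> onDisk, long lowerBound) {
            return onDisk.stream().filter(f -> f.maxRound() >= lowerBound).toList();
        }

        public static void main(String[] args) {
            // node4's situation at the RECONNECT snapshot above: one old file, lower bound 747.
            List<PcesFile> node4 = List.of(
                    new PcesFile("2025-11-20T18+10+04.145266281Z_seq0_minr1_maxr388_orgn0.pces", 1, 388));
            System.out.println(filesToCopy(node4, 747)); // empty -> "No preconsensus event files ... found to copy"
        }
    }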
node4 6m 3.989s 2025-11-20 18:15:52.943 279 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 774 to disk. Reason: RECONNECT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/774 {"round":774,"freezeState":false,"reason":"RECONNECT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/774/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 6m 3.992s 2025-11-20 18:15:52.946 280 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 215.0 ms in RECONNECT_COMPLETE. Now in CHECKING
node4 6m 4.572s 2025-11-20 18:15:53.526 281 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting4.csv' ]
node4 6m 4.575s 2025-11-20 18:15:53.529 282 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node4 6m 4.933s 2025-11-20 18:15:53.887 283 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:3 H:5d2346aeec7b BR:772), num remaining: 3
node4 6m 4.934s 2025-11-20 18:15:53.888 284 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:1 H:c4bd14d13323 BR:772), num remaining: 2
node4 6m 4.934s 2025-11-20 18:15:53.888 285 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:0 H:8ea287416196 BR:772), num remaining: 1
node4 6m 4.935s 2025-11-20 18:15:53.889 286 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:2 H:7fd77aec73cf BR:772), num remaining: 0
node4 6m 9.018s 2025-11-20 18:15:57.972 412 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 5.0 s in CHECKING. Now in ACTIVE
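Between the BEHIND detection at the top of this section and the line above, node4 walks through OBSERVING -> BEHIND -> RECONNECT_COMPLETE -> CHECKING -> ACTIVE, and each transition logs the time spent in the previous status. A minimal sketch of that bookkeeping follows; the real StatusStateMachine has many more inputs than a simple setter, so this only illustrates the duration reporting.

    import java.time.Duration;
    import java.time.Instant;

    // Minimal sketch of the "Platform spent X in <STATUS>. Now in <STATUS>" bookkeeping.
    public class StatusStateMachineSketch {

        enum PlatformStatus { OBSERVING, BEHIND, RECONNECT_COMPLETE, CHECKING, ACTIVE }

        private PlatformStatus current = PlatformStatus.OBSERVING;
        private Instant enteredAt = Instant.now();

        void transitionTo(PlatformStatus next) {
            Duration spent = Duration.between(enteredAt, Instant.now());
            System.out.printf("Platform spent %d ms in %s. Now in %s%n", spent.toMillis(), current, next);
            current = next;
            enteredAt = Instant.now();
        }

        public static void main(String[] args) throws InterruptedException {
            StatusStateMachineSketch machine = new StatusStateMachineSketch();
            for (PlatformStatus next : new PlatformStatus[] {
                    PlatformStatus.BEHIND, PlatformStatus.RECONNECT_COMPLETE,
                    PlatformStatus.CHECKING, PlatformStatus.ACTIVE }) {
                Thread.sleep(10); // stand-in for the real time spent in each status
                machine.transitionTo(next);
            }
        }
    }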
node0 6m 12.638s 2025-11-20 18:16:01.592 9386 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 795 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 6m 12.647s 2025-11-20 18:16:01.601 9163 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 795 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 6m 12.737s 2025-11-20 18:16:01.691 9139 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 795 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 6m 12.846s 2025-11-20 18:16:01.800 9079 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 795 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 6m 12.849s 2025-11-20 18:16:01.803 489 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 795 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 6m 12.911s 2025-11-20 18:16:01.865 9082 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 795 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/795
node2 6m 12.911s 2025-11-20 18:16:01.865 9083 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/45 for 795
node4 6m 12.918s 2025-11-20 18:16:01.872 491 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 795 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/795
node4 6m 12.919s 2025-11-20 18:16:01.873 492 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/10 for 795
node1 6m 12.953s 2025-11-20 18:16:01.907 9166 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 795 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/795
node1 6m 12.954s 2025-11-20 18:16:01.908 9167 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for 795
node2 6m 12.990s 2025-11-20 18:16:01.944 9122 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/45 for 795
node2 6m 12.992s 2025-11-20 18:16:01.946 9123 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 795
Timestamp: 2025-11-20T18:16:00.229789Z
Next consensus number: 23930
Legacy running event hash: 7da8c64db49a702c8ff23347783f361906d8a45b631a4089b161eb6f858226d84cad3b4b81461e1be17ce1b664e64598
Legacy running event mnemonic: relax-cushion-holiday-airport
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 271891771
Root hash: 22a5427ce7576dd0e6bdb8cc33d23cdf056cad23ecc5fc585053e05cb6b0fafd5786d6ba60a445aef9f496e5a9fcbea5
(root) VirtualMap state / feed-exhibit-benefit-ring {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"connect-whip-stamp-above"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"idle-security-guess-torch"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"vague-palace-outside-ahead"}}}
node2 6m 12.998s 2025-11-20 18:16:01.952 9124 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/11/20/2025-11-20T18+13+53.164255242Z_seq1_minr474_maxr5474_orgn0.pces Last file: data/saved/preconsensus-events/2/2025/11/20/2025-11-20T18+10+04.273287088Z_seq0_minr1_maxr501_orgn0.pces
node2 6m 12.999s 2025-11-20 18:16:01.953 9125 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 768 File: data/saved/preconsensus-events/2/2025/11/20/2025-11-20T18+13+53.164255242Z_seq1_minr474_maxr5474_orgn0.pces
node2 6m 12.999s 2025-11-20 18:16:01.953 9126 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 6m 13.007s 2025-11-20 18:16:01.961 9127 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 6m 13.007s 2025-11-20 18:16:01.961 9128 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 795 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/795 {"round":795,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/795/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 6m 13.009s 2025-11-20 18:16:01.963 9129 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/120
node3 6m 13.024s 2025-11-20 18:16:01.978 9142 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 795 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/795
node3 6m 13.025s 2025-11-20 18:16:01.979 9143 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for 795
node4 6m 13.029s 2025-11-20 18:16:01.983 537 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/10 for 795
node4 6m 13.031s 2025-11-20 18:16:01.985 538 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 795
Timestamp: 2025-11-20T18:16:00.229789Z
Next consensus number: 23930
Legacy running event hash: 7da8c64db49a702c8ff23347783f361906d8a45b631a4089b161eb6f858226d84cad3b4b81461e1be17ce1b664e64598
Legacy running event mnemonic: relax-cushion-holiday-airport
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 271891771
Root hash: 22a5427ce7576dd0e6bdb8cc33d23cdf056cad23ecc5fc585053e05cb6b0fafd5786d6ba60a445aef9f496e5a9fcbea5
(root) VirtualMap state / feed-exhibit-benefit-ring {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"connect-whip-stamp-above"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"idle-security-guess-torch"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"vague-palace-outside-ahead"}}}
node1 6m 13.037s 2025-11-20 18:16:01.991 9198 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for 795
node4 6m 13.039s 2025-11-20 18:16:01.993 539 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/4/2025/11/20/2025-11-20T18+10+04.145266281Z_seq0_minr1_maxr388_orgn0.pces Last file: data/saved/preconsensus-events/4/2025/11/20/2025-11-20T18+15+53.319104201Z_seq1_minr747_maxr1247_orgn774.pces
node1 6m 13.040s 2025-11-20 18:16:01.994 9199 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 795
Timestamp: 2025-11-20T18:16:00.229789Z
Next consensus number: 23930
Legacy running event hash: 7da8c64db49a702c8ff23347783f361906d8a45b631a4089b161eb6f858226d84cad3b4b81461e1be17ce1b664e64598
Legacy running event mnemonic: relax-cushion-holiday-airport
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 271891771
Root hash: 22a5427ce7576dd0e6bdb8cc33d23cdf056cad23ecc5fc585053e05cb6b0fafd5786d6ba60a445aef9f496e5a9fcbea5
(root) VirtualMap state / feed-exhibit-benefit-ring {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"connect-whip-stamp-above"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"idle-security-guess-torch"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"vague-palace-outside-ahead"}}}
node4 6m 13.040s 2025-11-20 18:16:01.994 540 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 768 File: data/saved/preconsensus-events/4/2025/11/20/2025-11-20T18+15+53.319104201Z_seq1_minr747_maxr1247_orgn774.pces
node4 6m 13.040s 2025-11-20 18:16:01.994 541 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 6m 13.042s 2025-11-20 18:16:01.996 542 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 6m 13.042s 2025-11-20 18:16:01.996 543 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 795 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/795 {"round":795,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/795/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 6m 13.044s 2025-11-20 18:16:01.998 544 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1
node1 6m 13.047s 2025-11-20 18:16:02.001 9200 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/11/20/2025-11-20T18+10+03.963750774Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/1/2025/11/20/2025-11-20T18+13+53.150382113Z_seq1_minr474_maxr5474_orgn0.pces
node1 6m 13.047s 2025-11-20 18:16:02.001 9201 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 768 File: data/saved/preconsensus-events/1/2025/11/20/2025-11-20T18+13+53.150382113Z_seq1_minr474_maxr5474_orgn0.pces
node1 6m 13.047s 2025-11-20 18:16:02.001 9202 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 6m 13.053s 2025-11-20 18:16:02.007 9203 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 6m 13.053s 2025-11-20 18:16:02.007 9204 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 795 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/795 {"round":795,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/795/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 6m 13.055s 2025-11-20 18:16:02.009 9205 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/120
node0 6m 13.063s 2025-11-20 18:16:02.017 9389 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 795 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/795
node0 6m 13.064s 2025-11-20 18:16:02.018 9390 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for 795
node3 6m 13.113s 2025-11-20 18:16:02.067 9174 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for 795
node3 6m 13.115s 2025-11-20 18:16:02.069 9175 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 795
Timestamp: 2025-11-20T18:16:00.229789Z
Next consensus number: 23930
Legacy running event hash: 7da8c64db49a702c8ff23347783f361906d8a45b631a4089b161eb6f858226d84cad3b4b81461e1be17ce1b664e64598
Legacy running event mnemonic: relax-cushion-holiday-airport
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 271891771
Root hash: 22a5427ce7576dd0e6bdb8cc33d23cdf056cad23ecc5fc585053e05cb6b0fafd5786d6ba60a445aef9f496e5a9fcbea5
(root) VirtualMap state / feed-exhibit-benefit-ring {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"connect-whip-stamp-above"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"idle-security-guess-torch"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"vague-palace-outside-ahead"}}}
node3 6m 13.123s 2025-11-20 18:16:02.077 9176 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/11/20/2025-11-20T18+13+53.099078562Z_seq1_minr474_maxr5474_orgn0.pces Last file: data/saved/preconsensus-events/3/2025/11/20/2025-11-20T18+10+04.243478241Z_seq0_minr1_maxr501_orgn0.pces
node3 6m 13.123s 2025-11-20 18:16:02.077 9177 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 768 File: data/saved/preconsensus-events/3/2025/11/20/2025-11-20T18+13+53.099078562Z_seq1_minr474_maxr5474_orgn0.pces
node3 6m 13.123s 2025-11-20 18:16:02.077 9178 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 6m 13.128s 2025-11-20 18:16:02.082 9179 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 6m 13.129s 2025-11-20 18:16:02.083 9180 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 795 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/795 {"round":795,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/795/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 6m 13.131s 2025-11-20 18:16:02.085 9181 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/120
node0 6m 13.149s 2025-11-20 18:16:02.103 9427 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for 795
node0 6m 13.151s 2025-11-20 18:16:02.105 9428 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 795
Timestamp: 2025-11-20T18:16:00.229789Z
Next consensus number: 23930
Legacy running event hash: 7da8c64db49a702c8ff23347783f361906d8a45b631a4089b161eb6f858226d84cad3b4b81461e1be17ce1b664e64598
Legacy running event mnemonic: relax-cushion-holiday-airport
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 271891771
Root hash: 22a5427ce7576dd0e6bdb8cc33d23cdf056cad23ecc5fc585053e05cb6b0fafd5786d6ba60a445aef9f496e5a9fcbea5
(root) VirtualMap state / feed-exhibit-benefit-ring {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"connect-whip-stamp-above"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"idle-security-guess-torch"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"vague-palace-outside-ahead"}}}
node0 6m 13.157s 2025-11-20 18:16:02.111 9429 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/11/20/2025-11-20T18+10+04.284523695Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/0/2025/11/20/2025-11-20T18+13+53.162136071Z_seq1_minr474_maxr5474_orgn0.pces
node0 6m 13.158s 2025-11-20 18:16:02.112 9430 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 768 File: data/saved/preconsensus-events/0/2025/11/20/2025-11-20T18+13+53.162136071Z_seq1_minr474_maxr5474_orgn0.pces
node0 6m 13.158s 2025-11-20 18:16:02.112 9431 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 6m 13.163s 2025-11-20 18:16:02.117 9432 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 6m 13.163s 2025-11-20 18:16:02.117 9433 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 795 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/795 {"round":795,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/795/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 6m 13.165s 2025-11-20 18:16:02.119 9434 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/120
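The round-795 snapshots above land at 18:16:01 and the round-927 snapshots below at 18:17:01, so the PERIODIC_SNAPSHOT reason fires roughly once per minute on every node, and each successful write is followed by deleting the oldest saved round (120 here, 249 in the next batch). The sketch below illustrates that save-then-prune cadence; the 60-second period is read off these timestamps, while the retained-state count is a placeholder rather than a configuration value.

    import java.time.Duration;
    import java.time.Instant;
    import java.util.ArrayDeque;
    import java.util.Deque;

    // Sketch of the periodic save-then-prune behaviour visible in the STATE_TO_DISK lines.
    public class PeriodicSnapshotSketch {

        private static final Duration PERIOD = Duration.ofSeconds(60); // inferred from the log timestamps
        private static final int STATES_ON_DISK = 5;                   // placeholder retention count

        private Instant lastSave = Instant.MIN;
        private final Deque<Long> savedRounds = new ArrayDeque<>();

        void maybeSave(long round, Instant now) {
            if (Duration.between(lastSave, now).compareTo(PERIOD) < 0) {
                return; // not time for a PERIODIC_SNAPSHOT yet
            }
            lastSave = now;
            savedRounds.addLast(round);
            System.out.println("Started writing round " + round + " state to disk");
            while (savedRounds.size() > STATES_ON_DISK) {
                long oldest = savedRounds.removeFirst();
                System.out.println("deleting directory .../123/" + oldest); // the FileUtils step above
            }
        }

        public static void main(String[] args) {
            PeriodicSnapshotSketch writer = new PeriodicSnapshotSketch();
            Instant t0 = Instant.parse("2025-11-20T18:16:01Z");
            writer.maybeSave(795, t0);
            writer.maybeSave(800, t0.plusSeconds(5));   // skipped: within the period
            writer.maybeSave(927, t0.plusSeconds(60));  // next periodic snapshot, as in the log
        }
    }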
node1 7m 12.158s 2025-11-20 18:17:01.112 10648 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 927 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 7m 12.161s 2025-11-20 18:17:01.115 10556 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 927 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 7m 12.217s 2025-11-20 18:17:01.171 1963 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 927 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 7m 12.229s 2025-11-20 18:17:01.183 10883 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 927 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 7m 12.244s 2025-11-20 18:17:01.198 10618 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 927 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 7m 12.334s 2025-11-20 18:17:01.288 10886 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 927 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/927
node0 7m 12.335s 2025-11-20 18:17:01.289 10887 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for 927
node4 7m 12.417s 2025-11-20 18:17:01.371 1966 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 927 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/927
node4 7m 12.417s 2025-11-20 18:17:01.371 1967 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/17 for 927
node0 7m 12.419s 2025-11-20 18:17:01.373 10926 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for 927
node3 7m 12.419s 2025-11-20 18:17:01.373 10621 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 927 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/927
node3 7m 12.420s 2025-11-20 18:17:01.374 10622 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for 927
node0 7m 12.422s 2025-11-20 18:17:01.376 10927 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 927
Timestamp: 2025-11-20T18:17:00.132849Z
Next consensus number: 28710
Legacy running event hash: f2efb54f38884bd08ae7c258513b57d328d421324c2b86d7bbb4f7ae2f7dce3da47b4206e22648afe6fcbc93d7327026
Legacy running event mnemonic: peasant-file-baby-key
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1400322583
Root hash: c32a4713cf65bba70e05d038d01ab4f1d9c0f7143d4d043bb40ffd9ce1ef7f6e22c21a025a22b9a059f8bcebebfd82fc
(root) VirtualMap state / onion-tiny-beach-menu {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"there-mushroom-online-kid"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"head-chair-range-kind"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"bicycle-holiday-punch-biology"}}}
node0 7m 12.427s 2025-11-20 18:17:01.381 10928 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/11/20/2025-11-20T18+10+04.284523695Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/0/2025/11/20/2025-11-20T18+13+53.162136071Z_seq1_minr474_maxr5474_orgn0.pces
node0 7m 12.428s 2025-11-20 18:17:01.382 10929 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 900 File: data/saved/preconsensus-events/0/2025/11/20/2025-11-20T18+13+53.162136071Z_seq1_minr474_maxr5474_orgn0.pces
node0 7m 12.430s 2025-11-20 18:17:01.384 10930 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 7m 12.439s 2025-11-20 18:17:01.393 10931 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 7m 12.439s 2025-11-20 18:17:01.393 10932 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 927 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/927 {"round":927,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/927/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 7m 12.441s 2025-11-20 18:17:01.395 10933 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/249
node1 7m 12.460s 2025-11-20 18:17:01.414 10651 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 927 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/927
node1 7m 12.461s 2025-11-20 18:17:01.415 10652 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for 927
node2 7m 12.464s 2025-11-20 18:17:01.418 10559 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 927 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/927
node2 7m 12.465s 2025-11-20 18:17:01.419 10560 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/52 for 927
node3 7m 12.503s 2025-11-20 18:17:01.457 10653 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for 927
node3 7m 12.505s 2025-11-20 18:17:01.459 10654 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 927
Timestamp: 2025-11-20T18:17:00.132849Z
Next consensus number: 28710
Legacy running event hash: f2efb54f38884bd08ae7c258513b57d328d421324c2b86d7bbb4f7ae2f7dce3da47b4206e22648afe6fcbc93d7327026
Legacy running event mnemonic: peasant-file-baby-key
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1400322583
Root hash: c32a4713cf65bba70e05d038d01ab4f1d9c0f7143d4d043bb40ffd9ce1ef7f6e22c21a025a22b9a059f8bcebebfd82fc
(root) VirtualMap state / onion-tiny-beach-menu {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"there-mushroom-online-kid"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"head-chair-range-kind"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"bicycle-holiday-punch-biology"}}}
node3 7m 12.512s 2025-11-20 18:17:01.466 10655 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/11/20/2025-11-20T18+13+53.099078562Z_seq1_minr474_maxr5474_orgn0.pces Last file: data/saved/preconsensus-events/3/2025/11/20/2025-11-20T18+10+04.243478241Z_seq0_minr1_maxr501_orgn0.pces
node3 7m 12.512s 2025-11-20 18:17:01.466 10656 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 900 File: data/saved/preconsensus-events/3/2025/11/20/2025-11-20T18+13+53.099078562Z_seq1_minr474_maxr5474_orgn0.pces
node3 7m 12.515s 2025-11-20 18:17:01.469 10657 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 7m 12.523s 2025-11-20 18:17:01.477 10658 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 7m 12.524s 2025-11-20 18:17:01.478 10659 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 927 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/927 {"round":927,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/927/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 7m 12.525s 2025-11-20 18:17:01.479 10660 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/249
node4 7m 12.527s 2025-11-20 18:17:01.481 2004 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/17 for 927
node4 7m 12.530s 2025-11-20 18:17:01.484 2005 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 927
Timestamp: 2025-11-20T18:17:00.132849Z
Next consensus number: 28710
Legacy running event hash: f2efb54f38884bd08ae7c258513b57d328d421324c2b86d7bbb4f7ae2f7dce3da47b4206e22648afe6fcbc93d7327026
Legacy running event mnemonic: peasant-file-baby-key
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1400322583
Root hash: c32a4713cf65bba70e05d038d01ab4f1d9c0f7143d4d043bb40ffd9ce1ef7f6e22c21a025a22b9a059f8bcebebfd82fc
(root) VirtualMap state / onion-tiny-beach-menu {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"there-mushroom-online-kid"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"head-chair-range-kind"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"bicycle-holiday-punch-biology"}}}
node4 7m 12.538s 2025-11-20 18:17:01.492 2006 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/4/2025/11/20/2025-11-20T18+10+04.145266281Z_seq0_minr1_maxr388_orgn0.pces Last file: data/saved/preconsensus-events/4/2025/11/20/2025-11-20T18+15+53.319104201Z_seq1_minr747_maxr1247_orgn774.pces
node4 7m 12.538s 2025-11-20 18:17:01.492 2007 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 900 File: data/saved/preconsensus-events/4/2025/11/20/2025-11-20T18+15+53.319104201Z_seq1_minr747_maxr1247_orgn774.pces
node4 7m 12.538s 2025-11-20 18:17:01.492 2008 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 7m 12.541s 2025-11-20 18:17:01.495 10691 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for 927
node4 7m 12.543s 2025-11-20 18:17:01.497 2009 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 7m 12.543s 2025-11-20 18:17:01.497 2010 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 927 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/927 {"round":927,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/927/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 7m 12.544s 2025-11-20 18:17:01.498 10692 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 927
Timestamp: 2025-11-20T18:17:00.132849Z
Next consensus number: 28710
Legacy running event hash: f2efb54f38884bd08ae7c258513b57d328d421324c2b86d7bbb4f7ae2f7dce3da47b4206e22648afe6fcbc93d7327026
Legacy running event mnemonic: peasant-file-baby-key
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1400322583
Root hash: c32a4713cf65bba70e05d038d01ab4f1d9c0f7143d4d043bb40ffd9ce1ef7f6e22c21a025a22b9a059f8bcebebfd82fc
(root) VirtualMap state / onion-tiny-beach-menu {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"there-mushroom-online-kid"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"head-chair-range-kind"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"bicycle-holiday-punch-biology"}}}
node4 7m 12.545s 2025-11-20 18:17:01.499 2011 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/120
node1 7m 12.550s 2025-11-20 18:17:01.504 10693 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/11/20/2025-11-20T18+10+03.963750774Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/1/2025/11/20/2025-11-20T18+13+53.150382113Z_seq1_minr474_maxr5474_orgn0.pces
node1 7m 12.550s 2025-11-20 18:17:01.504 10694 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 900 File: data/saved/preconsensus-events/1/2025/11/20/2025-11-20T18+13+53.150382113Z_seq1_minr474_maxr5474_orgn0.pces
node2 7m 12.551s 2025-11-20 18:17:01.505 10599 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/52 for 927
node1 7m 12.553s 2025-11-20 18:17:01.507 10695 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 7m 12.553s 2025-11-20 18:17:01.507 10600 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 927
Timestamp: 2025-11-20T18:17:00.132849Z
Next consensus number: 28710
Legacy running event hash: f2efb54f38884bd08ae7c258513b57d328d421324c2b86d7bbb4f7ae2f7dce3da47b4206e22648afe6fcbc93d7327026
Legacy running event mnemonic: peasant-file-baby-key
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1400322583
Root hash: c32a4713cf65bba70e05d038d01ab4f1d9c0f7143d4d043bb40ffd9ce1ef7f6e22c21a025a22b9a059f8bcebebfd82fc
(root) VirtualMap state / onion-tiny-beach-menu {"Queues (Queue States)":{},"VirtualMapMetadata":{"firstLeafPath":4,"lastLeafPath":8},"Singletons":{"ConsistencyTestingToolService.STATE_LONG":{"path":8,"mnemonic":"there-mushroom-online-kid"},"ConsistencyTestingToolService.ROUND_HANDLED":{"path":6,"mnemonic":"head-chair-range-kind"},"RosterService.ROSTER_STATE":{"path":5,"mnemonic":"round-forward-excite-sugar"},"PlatformStateService.PLATFORM_STATE":{"path":7,"mnemonic":"bicycle-holiday-punch-biology"}}}
node1 7m 12.561s 2025-11-20 18:17:01.515 10696 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 7m 12.561s 2025-11-20 18:17:01.515 10601 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/11/20/2025-11-20T18+13+53.164255242Z_seq1_minr474_maxr5474_orgn0.pces Last file: data/saved/preconsensus-events/2/2025/11/20/2025-11-20T18+10+04.273287088Z_seq0_minr1_maxr501_orgn0.pces
node2 7m 12.561s 2025-11-20 18:17:01.515 10602 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 900 File: data/saved/preconsensus-events/2/2025/11/20/2025-11-20T18+13+53.164255242Z_seq1_minr474_maxr5474_orgn0.pces
node2 7m 12.561s 2025-11-20 18:17:01.515 10603 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 7m 12.562s 2025-11-20 18:17:01.516 10697 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 927 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/927 {"round":927,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/927/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 7m 12.563s 2025-11-20 18:17:01.517 10698 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/249
node2 7m 12.570s 2025-11-20 18:17:01.524 10604 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 7m 12.570s 2025-11-20 18:17:01.524 10605 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 927 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/927 {"round":927,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/927/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 7m 12.573s 2025-11-20 18:17:01.527 10606 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/249
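Editor's note: the lines above show the periodic-snapshot pattern on each node: the state for round 927 is first written under data/saved/.../<node>/123/927, and only then is the oldest retained snapshot directory (round 249 here) deleted. The sketch below illustrates that write-then-prune retention in plain Java, under the assumption that snapshots are kept as numbered round directories; it is an illustration only, not the platform's StateSnapshotManager.

import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Comparator;
import java.util.List;

/**
 * Illustrative write-then-prune snapshot retention, mirroring the pattern in
 * the log: after a new round directory is written, the lowest-numbered
 * retained round directory is removed. Assumes snapshots live in plain
 * numbered directories such as .../1/123/927; NOT platform code.
 */
public final class SnapshotRetentionSketch {

    /** Deletes the lowest-numbered round directory once more than {@code keep} are present. */
    static void pruneOldest(final Path savedDir, final int keep) throws IOException {
        try (var rounds = Files.list(savedDir)) {
            final List<Path> sorted = rounds
                    .filter(p -> p.getFileName().toString().matches("\\d+"))
                    .sorted(Comparator.comparingLong(p -> Long.parseLong(p.getFileName().toString())))
                    .toList();
            if (sorted.size() > keep) {
                deleteRecursively(sorted.get(0)); // e.g. .../249 after .../927 has been written
            }
        }
    }

    static void deleteRecursively(final Path dir) throws IOException {
        try (var walk = Files.walk(dir)) {
            walk.sorted(Comparator.reverseOrder()).forEach(p -> {
                try {
                    Files.delete(p);
                } catch (IOException e) {
                    throw new UncheckedIOException(e);
                }
            });
        }
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical saved-state directory layout matching the paths in the log.
        pruneOldest(Path.of("data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123"), 1);
    }
}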
node3 8.007m 2025-11-20 18:17:49.357 11796 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith1 3 to 1>> NetworkUtils: Connection broken: 3 <- 1
com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-20T18:17:49.355703557Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:65)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 8 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:387)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
    ... 8 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:431)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    ... 2 more
node0 8.007m 2025-11-20 18:17:49.358 12039 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith1 0 to 1>> NetworkUtils: Connection broken: 0 -> 1
com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-20T18:17:49.355597212Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:65)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 8 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:387)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
    ... 8 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:431)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    ... 2 more
node2 8.007m 2025-11-20 18:17:49.358 11742 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith1 2 to 1>> NetworkUtils: Connection broken: 2 <- 1
com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-20T18:17:49.355804643Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:65)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 8 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:387)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
    ... 8 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:431)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    ... 2 more
node4 8.007m 2025-11-20 18:17:49.359 3138 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith1 4 to 1>> NetworkUtils: Connection broken: 4 <- 1
com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-20T18:17:49.357871970Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:65)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 8 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:387)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
    ... 8 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:431)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    ... 2 more
node4 8.016m 2025-11-20 18:17:49.893 3150 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith0 4 to 0>> NetworkUtils: Connection broken: 4 <- 0
com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-20T18:17:49.888563996Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:65)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 8 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:387)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
    ... 8 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:431)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    ... 2 more
node4 8.016m 2025-11-20 18:17:49.896 3151 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith2 4 to 2>> NetworkUtils: Connection broken: 4 <- 2
com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-20T18:17:49.894630135Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:65)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 8 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:387)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
    ... 8 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:431)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    ... 2 more
node4 8m 1.101s 2025-11-20 18:17:50.055 3152 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith3 4 to 3>> NetworkUtils: Connection broken: 4 <- 3
com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-20T18:17:50.053925049Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:65)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 8 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:387)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
    ... 8 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:431)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    ... 2 more
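Editor's note: every connection-broken trace above has the same shape: the read side's "Connection reset" is reported as the cause chain of the wrapping ParallelExecutionException, while the write side's "Connection or outbound has closed" is attached as a suppressed exception. The sketch below reproduces that shape with a generic executor and two deliberately failing tasks; it illustrates the standard Java suppressed-exception pattern only and is not the Swirlds CachedPoolParallelExecutor.

import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

/**
 * Generic illustration (not Swirlds code) of a parallel read/write pair where
 * the first observed failure becomes the cause of a wrapping exception and
 * the second failure is attached as a suppressed exception.
 */
public final class ParallelFailureSketch {

    /** Stand-in for a wrapping exception such as ParallelExecutionException; illustrative only. */
    static final class ParallelFailure extends RuntimeException {
        ParallelFailure(Throwable cause) {
            super("Time thrown: " + java.time.Instant.now(), cause);
        }
    }

    static void doParallel(ExecutorService pool, Callable<Void> reader, Callable<Void> writer) {
        Future<Void> readFuture = pool.submit(reader);
        Future<Void> writeFuture = pool.submit(writer);
        ParallelFailure failure = null;
        try {
            readFuture.get();
        } catch (InterruptedException | ExecutionException e) {
            failure = new ParallelFailure(e); // first failure becomes the cause chain
        }
        try {
            writeFuture.get();
        } catch (InterruptedException | ExecutionException e) {
            if (failure == null) {
                failure = new ParallelFailure(e);
            } else {
                failure.addSuppressed(e); // second failure shows up as "Suppressed:"
            }
        }
        if (failure != null) {
            throw failure;
        }
    }

    public static void main(String[] args) {
        ExecutorService pool = Executors.newCachedThreadPool();
        try {
            doParallel(pool,
                    () -> { throw new java.net.SocketException("Connection reset"); },
                    () -> { throw new java.net.SocketException("Connection or outbound has closed"); });
        } catch (ParallelFailure e) {
            e.printStackTrace(); // prints a Caused by + Suppressed shape similar to the log traces
        } finally {
            pool.shutdown();
        }
    }
}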