2026-04-23 00:26:31.761 15269 INFO oslo_service.periodic_task [-] Skipping periodic task _discover_hosts_in_cells because its interval is negative
2026-04-23 00:26:31.862 15269 CRITICAL nova [req-a562001f-0ff9-4e4f-ab72-3532141a76bd - - - - -] Unhandled error: oslo_db.exception.DBNonExistentTable: (sqlite3.OperationalError) no such table: cell_mappings [SQL: SELECT cell_mappings.created_at AS cell_mappings_created_at, cell_mappings.updated_at AS cell_mappings_updated_at, cell_mappings.id AS cell_mappings_id, cell_mappings.uuid AS cell_mappings_uuid, cell_mappings.name AS cell_mappings_name, cell_mappings.transport_url AS cell_mappings_transport_url, cell_mappings.database_connection AS cell_mappings_database_connection, cell_mappings.disabled AS cell_mappings_disabled FROM cell_mappings ORDER BY cell_mappings.id ASC] (Background on this error at: https://sqlalche.me/e/14/e3q8)
2026-04-23 00:26:31.862 15269 ERROR nova Traceback (most recent call last):
2026-04-23 00:26:31.862 15269 ERROR nova   File "/usr/lib/python3/dist-packages/sqlalchemy/engine/base.py", line 1802, in _execute_context
2026-04-23 00:26:31.862 15269 ERROR nova     self.dialect.do_execute(
2026-04-23 00:26:31.862 15269 ERROR nova   File "/usr/lib/python3/dist-packages/sqlalchemy/engine/default.py", line 732, in do_execute
2026-04-23 00:26:31.862 15269 ERROR nova     cursor.execute(statement, parameters)
2026-04-23 00:26:31.862 15269 ERROR nova sqlite3.OperationalError: no such table: cell_mappings
2026-04-23 00:26:31.862 15269 ERROR nova
2026-04-23 00:26:31.862 15269 ERROR nova The above exception was the direct cause of the following exception:
2026-04-23 00:26:31.862 15269 ERROR nova
2026-04-23 00:26:31.862 15269 ERROR nova Traceback (most recent call last):
2026-04-23 00:26:31.862 15269 ERROR nova   File "/usr/bin/nova-scheduler", line 10, in <module>
2026-04-23 00:26:31.862 15269 ERROR nova     sys.exit(main())
2026-04-23 00:26:31.862 15269 ERROR nova   File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main
2026-04-23 00:26:31.862 15269 ERROR nova     server = service.Service.create(
2026-04-23 00:26:31.862 15269 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create
2026-04-23 00:26:31.862 15269 ERROR nova     service_obj = cls(host, binary, topic, manager,
2026-04-23 00:26:31.862 15269 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__
2026-04-23 00:26:31.862 15269 ERROR nova     self.manager = manager_class(host=self.host, *args, **kwargs)
2026-04-23 00:26:31.862 15269 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 66, in __init__
2026-04-23 00:26:31.862 15269 ERROR nova     self.host_manager = host_manager.HostManager()
2026-04-23 00:26:31.862 15269 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/host_manager.py", line 334, in __init__
2026-04-23 00:26:31.862 15269 ERROR nova     self.refresh_cells_caches()
2026-04-23 00:26:31.862 15269 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/host_manager.py", line 718, in refresh_cells_caches
2026-04-23 00:26:31.862 15269 ERROR nova     temp_cells = objects.CellMappingList.get_all(context)
2026-04-23 00:26:31.862 15269 ERROR nova   File "/usr/lib/python3/dist-packages/oslo_versionedobjects/base.py", line 184, in wrapper
2026-04-23 00:26:31.862 15269 ERROR nova     result = fn(cls, context, *args, **kwargs)
2026-04-23 00:26:31.862 15269 ERROR nova   File "/usr/lib/python3/dist-packages/nova/objects/cell_mapping.py", line 256, in get_all
2026-04-23 00:26:31.862 15269 ERROR nova     db_mappings = cls._get_all_from_db(context)
2026-04-23 00:26:31.862 15269 ERROR nova   File "/usr/lib/python3/dist-packages/oslo_db/sqlalchemy/enginefacade.py", line 1010, in wrapper
2026-04-23 00:26:31.862 15269 ERROR nova     return fn(*args, **kwargs)
2026-04-23 00:26:31.862 15269 ERROR nova   File "/usr/lib/python3/dist-packages/nova/objects/cell_mapping.py", line 252, in _get_all_from_db
2026-04-23 00:26:31.862 15269 ERROR nova     expression.asc(api_db_models.CellMapping.id)).all()
2026-04-23 00:26:31.862 15269 ERROR nova   File "/usr/lib/python3/dist-packages/sqlalchemy/orm/query.py", line 2759, in all
2026-04-23 00:26:31.862 15269 ERROR nova     return self._iter().all()
2026-04-23 00:26:31.862 15269 ERROR nova   File "/usr/lib/python3/dist-packages/sqlalchemy/orm/query.py", line 2894, in _iter
2026-04-23 00:26:31.862 15269 ERROR nova     result = self.session.execute(
2026-04-23 00:26:31.862 15269 ERROR nova   File "/usr/lib/python3/dist-packages/sqlalchemy/orm/session.py", line 1692, in execute
2026-04-23 00:26:31.862 15269 ERROR nova     result = conn._execute_20(statement, params or {}, execution_options)
2026-04-23 00:26:31.862 15269 ERROR nova   File "/usr/lib/python3/dist-packages/sqlalchemy/engine/base.py", line 1614, in _execute_20
2026-04-23 00:26:31.862 15269 ERROR nova     return meth(self, args_10style, kwargs_10style, execution_options)
2026-04-23 00:26:31.862 15269 ERROR nova   File "/usr/lib/python3/dist-packages/sqlalchemy/sql/elements.py", line 325, in _execute_on_connection
2026-04-23 00:26:31.862 15269 ERROR nova     return connection._execute_clauseelement(
2026-04-23 00:26:31.862 15269 ERROR nova   File "/usr/lib/python3/dist-packages/sqlalchemy/engine/base.py", line 1481, in _execute_clauseelement
2026-04-23 00:26:31.862 15269 ERROR nova     ret = self._execute_context(
2026-04-23 00:26:31.862 15269 ERROR nova   File "/usr/lib/python3/dist-packages/sqlalchemy/engine/base.py", line 1845, in _execute_context
2026-04-23 00:26:31.862 15269 ERROR nova     self._handle_dbapi_exception(
2026-04-23 00:26:31.862 15269 ERROR nova   File "/usr/lib/python3/dist-packages/sqlalchemy/engine/base.py", line 2024, in _handle_dbapi_exception
2026-04-23 00:26:31.862 15269 ERROR nova     util.raise_(newraise, with_traceback=exc_info[2], from_=e)
2026-04-23 00:26:31.862 15269 ERROR nova   File "/usr/lib/python3/dist-packages/sqlalchemy/util/compat.py", line 207, in raise_
2026-04-23 00:26:31.862 15269 ERROR nova     raise exception
2026-04-23 00:26:31.862 15269 ERROR nova   File "/usr/lib/python3/dist-packages/sqlalchemy/engine/base.py", line 1802, in _execute_context
2026-04-23 00:26:31.862 15269 ERROR nova     self.dialect.do_execute(
2026-04-23 00:26:31.862 15269 ERROR nova   File "/usr/lib/python3/dist-packages/sqlalchemy/engine/default.py", line 732, in do_execute
2026-04-23 00:26:31.862 15269 ERROR nova     cursor.execute(statement, parameters)
2026-04-23 00:26:31.862 15269 ERROR nova oslo_db.exception.DBNonExistentTable: (sqlite3.OperationalError) no such table: cell_mappings
2026-04-23 00:26:31.862 15269 ERROR nova [SQL: SELECT cell_mappings.created_at AS cell_mappings_created_at, cell_mappings.updated_at AS cell_mappings_updated_at, cell_mappings.id AS cell_mappings_id, cell_mappings.uuid AS cell_mappings_uuid, cell_mappings.name AS cell_mappings_name, cell_mappings.transport_url AS cell_mappings_transport_url, cell_mappings.database_connection AS cell_mappings_database_connection, cell_mappings.disabled AS cell_mappings_disabled
2026-04-23 00:26:31.862 15269 ERROR nova FROM cell_mappings ORDER BY cell_mappings.id ASC]
2026-04-23 00:26:31.862 15269 ERROR nova (Background on this error at: https://sqlalche.me/e/14/e3q8)
2026-04-23 00:26:31.862 15269 ERROR nova
2026-04-23 00:26:37.348 15368 INFO oslo_service.periodic_task [-] Skipping periodic task _discover_hosts_in_cells because its interval is negative
2026-04-23 00:26:37.398 15368 CRITICAL nova [req-b4809b93-3a20-4719-ac40-a496f12ba91a - - - - -] Unhandled error: oslo_db.exception.DBNonExistentTable: (sqlite3.OperationalError) no such table: cell_mappings [SQL: SELECT cell_mappings.created_at AS cell_mappings_created_at, cell_mappings.updated_at AS cell_mappings_updated_at, cell_mappings.id AS cell_mappings_id, cell_mappings.uuid AS cell_mappings_uuid, cell_mappings.name AS cell_mappings_name, cell_mappings.transport_url AS cell_mappings_transport_url, cell_mappings.database_connection AS cell_mappings_database_connection, cell_mappings.disabled AS cell_mappings_disabled FROM cell_mappings ORDER BY cell_mappings.id ASC] (Background on this error at: https://sqlalche.me/e/14/e3q8)
2026-04-23 00:38:52.766 110166 DEBUG oslo_db.sqlalchemy.engines [req-f72a1ca1-8c96-4b24-9c3f-910f90e3e515 - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-23 00:38:52.799 110166 DEBUG nova.scheduler.host_manager [req-f72a1ca1-8c96-4b24-9c3f-910f90e3e515 - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730
2026-04-23 00:38:52.800 110166 DEBUG nova.scheduler.host_manager [req-f72a1ca1-8c96-4b24-9c3f-910f90e3e515 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742
2026-04-23 00:38:52.867 110166 WARNING nova.scheduler.filters.availability_zone_filter [req-f72a1ca1-8c96-4b24-9c3f-910f90e3e515 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement.
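The DBNonExistentTable tracebacks above mean nova-scheduler connected to a database in which the nova_api schema was never created (here SQLite, the fallback when [api_database]/connection is unset or wrong in nova.conf); the 00:38 run already finds cell de8a22a1, so the schema was evidently fixed in between. For reference, a hedged sketch of the usual remediation — the connection URL, password, and cell name below are placeholders, not taken from this log:

```shell
# Sketch only: point nova at the real API database, then create its schema.
# Assumed /etc/nova/nova.conf fragment (URL and password are placeholders):
#   [api_database]
#   connection = mysql+pymysql://nova:NOVA_DBPASS@controller/nova_api

# Create the API schema, which includes the cell_mappings table:
nova-manage api_db sync

# On a fresh deployment, also register cell0 and an initial cell:
nova-manage cell_v2 map_cell0
nova-manage cell_v2 create_cell --name=cell1 --verbose
```

After the schema exists, restarting nova-scheduler gets past the cell_mappings query, as the "Found 1 cells" entries above show.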
2026-04-23 00:38:52.884 110166 ERROR nova.scheduler.client.report [req-f72a1ca1-8c96-4b24-9c3f-910f90e3e515 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:38:52.885 110166 CRITICAL nova [req-f72a1ca1-8c96-4b24-9c3f-910f90e3e515 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:38:52.885 110166 ERROR nova Traceback (most recent call last): 2026-04-23 00:38:52.885 110166 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-23 00:38:52.885 110166 ERROR nova sys.exit(main()) 2026-04-23 00:38:52.885 110166 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-23 00:38:52.885 110166 ERROR nova server = service.Service.create( 2026-04-23 00:38:52.885 110166 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-23 00:38:52.885 110166 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-23 00:38:52.885 110166 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-23 00:38:52.885 110166 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-23 00:38:52.885 110166 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-23 00:38:52.885 110166 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-23 00:38:52.885 110166 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-23 00:38:52.885 110166 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-23 00:38:52.885 110166 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-23 00:38:52.885 110166 ERROR nova 
self._client = self._create_client() 2026-04-23 00:38:52.885 110166 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-23 00:38:52.885 110166 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-23 00:38:52.885 110166 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-23 00:38:52.885 110166 ERROR nova return getattr(conn, service_type) 2026-04-23 00:38:52.885 110166 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-23 00:38:52.885 110166 ERROR nova proxy = self._make_proxy(instance) 2026-04-23 00:38:52.885 110166 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-23 00:38:52.885 110166 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-23 00:38:52.885 110166 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-23 00:38:52.885 110166 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-23 00:38:52.885 110166 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-23 00:38:52.885 110166 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-23 00:38:52.885 110166 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-23 00:38:52.885 110166 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-23 00:38:52.885 110166 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:38:52.885 110166 ERROR nova 2026-04-23 00:38:55.170 110700 DEBUG oslo_db.sqlalchemy.engines [req-04233154-2683-4289-8462-fcc5782f0656 - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-23 00:38:55.204 110700 DEBUG nova.scheduler.host_manager [req-04233154-2683-4289-8462-fcc5782f0656 - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-23 00:38:55.204 110700 DEBUG nova.scheduler.host_manager [req-04233154-2683-4289-8462-fcc5782f0656 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-23 00:38:55.209 110700 WARNING nova.scheduler.filters.availability_zone_filter [req-04233154-2683-4289-8462-fcc5782f0656 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-23 00:38:55.297 110700 ERROR nova.scheduler.client.report [req-04233154-2683-4289-8462-fcc5782f0656 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:38:55.298 110700 CRITICAL nova [req-04233154-2683-4289-8462-fcc5782f0656 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:38:55.298 110700 ERROR nova Traceback (most recent call last): 2026-04-23 00:38:55.298 110700 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-23 00:38:55.298 110700 ERROR nova sys.exit(main()) 2026-04-23 00:38:55.298 110700 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-23 00:38:55.298 110700 ERROR nova server = service.Service.create( 2026-04-23 00:38:55.298 110700 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-23 00:38:55.298 110700 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-23 00:38:55.298 110700 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-23 00:38:55.298 110700 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-23 00:38:55.298 110700 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-23 00:38:55.298 110700 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-23 00:38:55.298 110700 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-23 00:38:55.298 110700 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-23 00:38:55.298 110700 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-23 00:38:55.298 110700 ERROR nova 
self._client = self._create_client() 2026-04-23 00:38:55.298 110700 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-23 00:38:55.298 110700 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-23 00:38:55.298 110700 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-23 00:38:55.298 110700 ERROR nova return getattr(conn, service_type) 2026-04-23 00:38:55.298 110700 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-23 00:38:55.298 110700 ERROR nova proxy = self._make_proxy(instance) 2026-04-23 00:38:55.298 110700 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-23 00:38:55.298 110700 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-23 00:38:55.298 110700 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-23 00:38:55.298 110700 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-23 00:38:55.298 110700 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-23 00:38:55.298 110700 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-23 00:38:55.298 110700 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-23 00:38:55.298 110700 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-23 00:38:55.298 110700 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:38:55.298 110700 ERROR nova 2026-04-23 00:38:57.358 111168 DEBUG oslo_db.sqlalchemy.engines [req-99d3125e-8cf7-4374-85f9-f3712067d2ad - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-23 00:38:57.394 111168 DEBUG nova.scheduler.host_manager [req-99d3125e-8cf7-4374-85f9-f3712067d2ad - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-23 00:38:57.394 111168 DEBUG nova.scheduler.host_manager [req-99d3125e-8cf7-4374-85f9-f3712067d2ad - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-23 00:38:57.399 111168 WARNING nova.scheduler.filters.availability_zone_filter [req-99d3125e-8cf7-4374-85f9-f3712067d2ad - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-23 00:38:57.481 111168 ERROR nova.scheduler.client.report [req-99d3125e-8cf7-4374-85f9-f3712067d2ad - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:38:57.482 111168 CRITICAL nova [req-99d3125e-8cf7-4374-85f9-f3712067d2ad - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:38:57.482 111168 ERROR nova Traceback (most recent call last): 2026-04-23 00:38:57.482 111168 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-23 00:38:57.482 111168 ERROR nova sys.exit(main()) 2026-04-23 00:38:57.482 111168 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-23 00:38:57.482 111168 ERROR nova server = service.Service.create( 2026-04-23 00:38:57.482 111168 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-23 00:38:57.482 111168 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-23 00:38:57.482 111168 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-23 00:38:57.482 111168 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-23 00:38:57.482 111168 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-23 00:38:57.482 111168 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-23 00:38:57.482 111168 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-23 00:38:57.482 111168 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-23 00:38:57.482 111168 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-23 00:38:57.482 111168 ERROR nova 
self._client = self._create_client() 2026-04-23 00:38:57.482 111168 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-23 00:38:57.482 111168 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-23 00:38:57.482 111168 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-23 00:38:57.482 111168 ERROR nova return getattr(conn, service_type) 2026-04-23 00:38:57.482 111168 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-23 00:38:57.482 111168 ERROR nova proxy = self._make_proxy(instance) 2026-04-23 00:38:57.482 111168 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-23 00:38:57.482 111168 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-23 00:38:57.482 111168 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-23 00:38:57.482 111168 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-23 00:38:57.482 111168 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-23 00:38:57.482 111168 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-23 00:38:57.482 111168 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-23 00:38:57.482 111168 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-23 00:38:57.482 111168 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:38:57.482 111168 ERROR nova 2026-04-23 00:38:59.524 111179 DEBUG oslo_db.sqlalchemy.engines [req-1895c1da-007a-4890-8acf-f22d463e65dd - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-23 00:38:59.556 111179 DEBUG nova.scheduler.host_manager [req-1895c1da-007a-4890-8acf-f22d463e65dd - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-23 00:38:59.557 111179 DEBUG nova.scheduler.host_manager [req-1895c1da-007a-4890-8acf-f22d463e65dd - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-23 00:38:59.561 111179 WARNING nova.scheduler.filters.availability_zone_filter [req-1895c1da-007a-4890-8acf-f22d463e65dd - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-23 00:38:59.650 111179 ERROR nova.scheduler.client.report [req-1895c1da-007a-4890-8acf-f22d463e65dd - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:38:59.651 111179 CRITICAL nova [req-1895c1da-007a-4890-8acf-f22d463e65dd - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:38:59.651 111179 ERROR nova Traceback (most recent call last):
2026-04-23 00:38:59.651 111179 ERROR nova   File "/usr/bin/nova-scheduler", line 10, in <module>
2026-04-23 00:38:59.651 111179 ERROR nova     sys.exit(main())
2026-04-23 00:38:59.651 111179 ERROR nova   File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main
2026-04-23 00:38:59.651 111179 ERROR nova     server = service.Service.create(
2026-04-23 00:38:59.651 111179 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create
2026-04-23 00:38:59.651 111179 ERROR nova     service_obj = cls(host, binary, topic, manager,
2026-04-23 00:38:59.651 111179 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__
2026-04-23 00:38:59.651 111179 ERROR nova     self.manager = manager_class(host=self.host, *args, **kwargs)
2026-04-23 00:38:59.651 111179 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__
2026-04-23 00:38:59.651 111179 ERROR nova     self.placement_client = report.report_client_singleton()
2026-04-23 00:38:59.651 111179 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton
2026-04-23 00:38:59.651 111179 ERROR nova     PLACEMENTCLIENT = SchedulerReportClient()
2026-04-23 00:38:59.651 111179 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__
2026-04-23 00:38:59.651 111179 ERROR nova     self._client = self._create_client()
2026-04-23 00:38:59.651 111179 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client
2026-04-23 00:38:59.651 111179 ERROR nova     client = self._adapter or utils.get_sdk_adapter('placement')
2026-04-23 00:38:59.651 111179 ERROR nova   File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter
2026-04-23 00:38:59.651 111179 ERROR nova     return getattr(conn, service_type)
2026-04-23 00:38:59.651 111179 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__
2026-04-23 00:38:59.651 111179 ERROR nova     proxy = self._make_proxy(instance)
2026-04-23 00:38:59.651 111179 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy
2026-04-23 00:38:59.651 111179 ERROR nova     found_version = temp_adapter.get_api_major_version()
2026-04-23 00:38:59.651 111179 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version
2026-04-23 00:38:59.651 111179 ERROR nova     return self.session.get_api_major_version(auth or self.auth, **kwargs)
2026-04-23 00:38:59.651 111179 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version
2026-04-23 00:38:59.651 111179 ERROR nova     auth = self._auth_required(auth, 'determine endpoint URL')
2026-04-23 00:38:59.651 111179 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required
2026-04-23 00:38:59.651 111179 ERROR nova     raise exceptions.MissingAuthPlugin(msg_fmt % msg)
2026-04-23 00:38:59.651 111179 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:38:59.651 111179 ERROR nova
2026-04-23 00:39:01.898 111191 DEBUG oslo_db.sqlalchemy.engines [req-2b4a29cf-2195-44eb-bf09-406bc15baf00 - - - - -] MySQL server mode set to
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-23 00:39:01.942 111191 DEBUG nova.scheduler.host_manager [req-2b4a29cf-2195-44eb-bf09-406bc15baf00 - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-23 00:39:01.942 111191 DEBUG nova.scheduler.host_manager [req-2b4a29cf-2195-44eb-bf09-406bc15baf00 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-23 00:39:01.946 111191 WARNING nova.scheduler.filters.availability_zone_filter [req-2b4a29cf-2195-44eb-bf09-406bc15baf00 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-23 00:39:02.030 111191 ERROR nova.scheduler.client.report [req-2b4a29cf-2195-44eb-bf09-406bc15baf00 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:39:02.030 111191 CRITICAL nova [req-2b4a29cf-2195-44eb-bf09-406bc15baf00 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:39:02.030 111191 ERROR nova Traceback (most recent call last): 2026-04-23 00:39:02.030 111191 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-23 00:39:02.030 111191 ERROR nova sys.exit(main()) 2026-04-23 00:39:02.030 111191 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-23 00:39:02.030 111191 ERROR nova server = service.Service.create( 2026-04-23 00:39:02.030 111191 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-23 00:39:02.030 111191 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-23 00:39:02.030 111191 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-23 00:39:02.030 111191 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-23 00:39:02.030 111191 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-23 00:39:02.030 111191 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-23 00:39:02.030 111191 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-23 00:39:02.030 111191 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-23 00:39:02.030 111191 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-23 00:39:02.030 111191 ERROR nova 
self._client = self._create_client() 2026-04-23 00:39:02.030 111191 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-23 00:39:02.030 111191 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-23 00:39:02.030 111191 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-23 00:39:02.030 111191 ERROR nova return getattr(conn, service_type) 2026-04-23 00:39:02.030 111191 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-23 00:39:02.030 111191 ERROR nova proxy = self._make_proxy(instance) 2026-04-23 00:39:02.030 111191 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-23 00:39:02.030 111191 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-23 00:39:02.030 111191 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-23 00:39:02.030 111191 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-23 00:39:02.030 111191 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-23 00:39:02.030 111191 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-23 00:39:02.030 111191 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-23 00:39:02.030 111191 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-23 00:39:02.030 111191 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:39:02.030 111191 ERROR nova 2026-04-23 00:39:04.170 111219 DEBUG oslo_db.sqlalchemy.engines [req-7bb56d87-77cd-437d-bef6-0c081d7ad814 - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-23 00:39:04.203 111219 DEBUG nova.scheduler.host_manager [req-7bb56d87-77cd-437d-bef6-0c081d7ad814 - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-23 00:39:04.204 111219 DEBUG nova.scheduler.host_manager [req-7bb56d87-77cd-437d-bef6-0c081d7ad814 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-23 00:39:04.208 111219 WARNING nova.scheduler.filters.availability_zone_filter [req-7bb56d87-77cd-437d-bef6-0c081d7ad814 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-23 00:39:04.283 111219 ERROR nova.scheduler.client.report [req-7bb56d87-77cd-437d-bef6-0c081d7ad814 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:39:04.284 111219 CRITICAL nova [req-7bb56d87-77cd-437d-bef6-0c081d7ad814 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:39:04.284 111219 ERROR nova Traceback (most recent call last): 2026-04-23 00:39:04.284 111219 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-23 00:39:04.284 111219 ERROR nova sys.exit(main()) 2026-04-23 00:39:04.284 111219 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-23 00:39:04.284 111219 ERROR nova server = service.Service.create( 2026-04-23 00:39:04.284 111219 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-23 00:39:04.284 111219 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-23 00:39:04.284 111219 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-23 00:39:04.284 111219 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-23 00:39:04.284 111219 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-23 00:39:04.284 111219 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-23 00:39:04.284 111219 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-23 00:39:04.284 111219 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-23 00:39:04.284 111219 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-23 00:39:04.284 111219 ERROR nova 
self._client = self._create_client() 2026-04-23 00:39:04.284 111219 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-23 00:39:04.284 111219 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-23 00:39:04.284 111219 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-23 00:39:04.284 111219 ERROR nova return getattr(conn, service_type) 2026-04-23 00:39:04.284 111219 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-23 00:39:04.284 111219 ERROR nova proxy = self._make_proxy(instance) 2026-04-23 00:39:04.284 111219 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-23 00:39:04.284 111219 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-23 00:39:04.284 111219 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-23 00:39:04.284 111219 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-23 00:39:04.284 111219 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-23 00:39:04.284 111219 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-23 00:39:04.284 111219 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-23 00:39:04.284 111219 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-23 00:39:04.284 111219 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:39:04.284 111219 ERROR nova 2026-04-23 00:39:06.346 111231 DEBUG oslo_db.sqlalchemy.engines [req-3442aeb2-66a8-4950-88a4-595f241d36b3 - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-23 00:39:06.388 111231 DEBUG nova.scheduler.host_manager [req-3442aeb2-66a8-4950-88a4-595f241d36b3 - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-23 00:39:06.388 111231 DEBUG nova.scheduler.host_manager [req-3442aeb2-66a8-4950-88a4-595f241d36b3 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-23 00:39:06.392 111231 WARNING nova.scheduler.filters.availability_zone_filter [req-3442aeb2-66a8-4950-88a4-595f241d36b3 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-23 00:39:06.471 111231 ERROR nova.scheduler.client.report [req-3442aeb2-66a8-4950-88a4-595f241d36b3 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:39:06.471 111231 CRITICAL nova [req-3442aeb2-66a8-4950-88a4-595f241d36b3 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:39:06.471 111231 ERROR nova Traceback (most recent call last): 2026-04-23 00:39:06.471 111231 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-23 00:39:06.471 111231 ERROR nova sys.exit(main()) 2026-04-23 00:39:06.471 111231 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-23 00:39:06.471 111231 ERROR nova server = service.Service.create( 2026-04-23 00:39:06.471 111231 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-23 00:39:06.471 111231 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-23 00:39:06.471 111231 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-23 00:39:06.471 111231 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-23 00:39:06.471 111231 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-23 00:39:06.471 111231 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-23 00:39:06.471 111231 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-23 00:39:06.471 111231 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-23 00:39:06.471 111231 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-23 00:39:06.471 111231 ERROR nova 
self._client = self._create_client() 2026-04-23 00:39:06.471 111231 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-23 00:39:06.471 111231 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-23 00:39:06.471 111231 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-23 00:39:06.471 111231 ERROR nova return getattr(conn, service_type) 2026-04-23 00:39:06.471 111231 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-23 00:39:06.471 111231 ERROR nova proxy = self._make_proxy(instance) 2026-04-23 00:39:06.471 111231 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-23 00:39:06.471 111231 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-23 00:39:06.471 111231 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-23 00:39:06.471 111231 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-23 00:39:06.471 111231 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-23 00:39:06.471 111231 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-23 00:39:06.471 111231 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-23 00:39:06.471 111231 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-23 00:39:06.471 111231 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:39:06.471 111231 ERROR nova 2026-04-23 00:39:08.424 111458 DEBUG oslo_db.sqlalchemy.engines [req-3a0667c6-6a7d-4f9c-ac01-d03f5bf99c1f - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-23 00:39:08.458 111458 DEBUG nova.scheduler.host_manager [req-3a0667c6-6a7d-4f9c-ac01-d03f5bf99c1f - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-23 00:39:08.459 111458 DEBUG nova.scheduler.host_manager [req-3a0667c6-6a7d-4f9c-ac01-d03f5bf99c1f - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-23 00:39:08.463 111458 WARNING nova.scheduler.filters.availability_zone_filter [req-3a0667c6-6a7d-4f9c-ac01-d03f5bf99c1f - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-23 00:39:08.547 111458 ERROR nova.scheduler.client.report [req-3a0667c6-6a7d-4f9c-ac01-d03f5bf99c1f - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:39:08.547 111458 CRITICAL nova [req-3a0667c6-6a7d-4f9c-ac01-d03f5bf99c1f - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:39:08.547 111458 ERROR nova Traceback (most recent call last): 2026-04-23 00:39:08.547 111458 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-23 00:39:08.547 111458 ERROR nova sys.exit(main()) 2026-04-23 00:39:08.547 111458 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-23 00:39:08.547 111458 ERROR nova server = service.Service.create( 2026-04-23 00:39:08.547 111458 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-23 00:39:08.547 111458 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-23 00:39:08.547 111458 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-23 00:39:08.547 111458 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-23 00:39:08.547 111458 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-23 00:39:08.547 111458 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-23 00:39:08.547 111458 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-23 00:39:08.547 111458 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-23 00:39:08.547 111458 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-23 00:39:08.547 111458 ERROR nova 
self._client = self._create_client() 2026-04-23 00:39:08.547 111458 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-23 00:39:08.547 111458 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-23 00:39:08.547 111458 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-23 00:39:08.547 111458 ERROR nova return getattr(conn, service_type) 2026-04-23 00:39:08.547 111458 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-23 00:39:08.547 111458 ERROR nova proxy = self._make_proxy(instance) 2026-04-23 00:39:08.547 111458 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-23 00:39:08.547 111458 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-23 00:39:08.547 111458 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-23 00:39:08.547 111458 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-23 00:39:08.547 111458 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-23 00:39:08.547 111458 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-23 00:39:08.547 111458 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-23 00:39:08.547 111458 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-23 00:39:08.547 111458 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:39:08.547 111458 ERROR nova 2026-04-23 00:39:10.735 111780 DEBUG oslo_db.sqlalchemy.engines [req-f3da440c-3812-406b-878a-84ce78af52c0 - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-23 00:39:10.768 111780 DEBUG nova.scheduler.host_manager [req-f3da440c-3812-406b-878a-84ce78af52c0 - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-23 00:39:10.769 111780 DEBUG nova.scheduler.host_manager [req-f3da440c-3812-406b-878a-84ce78af52c0 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-23 00:39:10.775 111780 WARNING nova.scheduler.filters.availability_zone_filter [req-f3da440c-3812-406b-878a-84ce78af52c0 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-23 00:39:10.871 111780 ERROR nova.scheduler.client.report [req-f3da440c-3812-406b-878a-84ce78af52c0 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:39:10.872 111780 CRITICAL nova [req-f3da440c-3812-406b-878a-84ce78af52c0 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:39:10.872 111780 ERROR nova Traceback (most recent call last): 2026-04-23 00:39:10.872 111780 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-23 00:39:10.872 111780 ERROR nova sys.exit(main()) 2026-04-23 00:39:10.872 111780 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-23 00:39:10.872 111780 ERROR nova server = service.Service.create( 2026-04-23 00:39:10.872 111780 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-23 00:39:10.872 111780 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-23 00:39:10.872 111780 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-23 00:39:10.872 111780 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-23 00:39:10.872 111780 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-23 00:39:10.872 111780 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-23 00:39:10.872 111780 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-23 00:39:10.872 111780 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-23 00:39:10.872 111780 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-23 00:39:10.872 111780 ERROR nova 
self._client = self._create_client() 2026-04-23 00:39:10.872 111780 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-23 00:39:10.872 111780 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-23 00:39:10.872 111780 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-23 00:39:10.872 111780 ERROR nova return getattr(conn, service_type) 2026-04-23 00:39:10.872 111780 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-23 00:39:10.872 111780 ERROR nova proxy = self._make_proxy(instance) 2026-04-23 00:39:10.872 111780 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-23 00:39:10.872 111780 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-23 00:39:10.872 111780 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-23 00:39:10.872 111780 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-23 00:39:10.872 111780 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-23 00:39:10.872 111780 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-23 00:39:10.872 111780 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-23 00:39:10.872 111780 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-23 00:39:10.872 111780 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:39:10.872 111780 ERROR nova 2026-04-23 00:39:13.087 112260 DEBUG oslo_db.sqlalchemy.engines [req-d82a6648-ebb0-4d1c-ae1d-284808cfa042 - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-23 00:39:13.125 112260 DEBUG nova.scheduler.host_manager [req-d82a6648-ebb0-4d1c-ae1d-284808cfa042 - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-23 00:39:13.126 112260 DEBUG nova.scheduler.host_manager [req-d82a6648-ebb0-4d1c-ae1d-284808cfa042 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-23 00:39:13.129 112260 WARNING nova.scheduler.filters.availability_zone_filter [req-d82a6648-ebb0-4d1c-ae1d-284808cfa042 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-23 00:39:13.206 112260 ERROR nova.scheduler.client.report [req-d82a6648-ebb0-4d1c-ae1d-284808cfa042 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:39:13.207 112260 CRITICAL nova [req-d82a6648-ebb0-4d1c-ae1d-284808cfa042 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:39:13.207 112260 ERROR nova Traceback (most recent call last): 2026-04-23 00:39:13.207 112260 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-23 00:39:13.207 112260 ERROR nova sys.exit(main()) 2026-04-23 00:39:13.207 112260 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-23 00:39:13.207 112260 ERROR nova server = service.Service.create( 2026-04-23 00:39:13.207 112260 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-23 00:39:13.207 112260 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-23 00:39:13.207 112260 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-23 00:39:13.207 112260 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-23 00:39:13.207 112260 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-23 00:39:13.207 112260 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-23 00:39:13.207 112260 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-23 00:39:13.207 112260 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-23 00:39:13.207 112260 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-23 00:39:13.207 112260 ERROR nova 
self._client = self._create_client() 2026-04-23 00:39:13.207 112260 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-23 00:39:13.207 112260 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-23 00:39:13.207 112260 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-23 00:39:13.207 112260 ERROR nova return getattr(conn, service_type) 2026-04-23 00:39:13.207 112260 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-23 00:39:13.207 112260 ERROR nova proxy = self._make_proxy(instance) 2026-04-23 00:39:13.207 112260 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-23 00:39:13.207 112260 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-23 00:39:13.207 112260 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-23 00:39:13.207 112260 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-23 00:39:13.207 112260 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-23 00:39:13.207 112260 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-23 00:39:13.207 112260 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-23 00:39:13.207 112260 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-23 00:39:13.207 112260 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:39:13.207 112260 ERROR nova 2026-04-23 00:39:15.384 112270 DEBUG oslo_db.sqlalchemy.engines [req-6fa2a9ad-f3d6-4d83-8ff5-adada16807f1 - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-23 00:39:15.414 112270 DEBUG nova.scheduler.host_manager [req-6fa2a9ad-f3d6-4d83-8ff5-adada16807f1 - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730
2026-04-23 00:39:15.414 112270 DEBUG nova.scheduler.host_manager [req-6fa2a9ad-f3d6-4d83-8ff5-adada16807f1 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742
2026-04-23 00:39:15.418 112270 WARNING nova.scheduler.filters.availability_zone_filter [req-6fa2a9ad-f3d6-4d83-8ff5-adada16807f1 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement.
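An aside on the WARNING above, which is unrelated to the crash: since Xena, availability-zone enforcement is handled by placement (via `query_placement_for_availability_zone`, enabled by default), so `AvailabilityZoneFilter` can simply be dropped from the scheduler's filter list. A sketch of the relevant nova.conf fragment — the filter list shown is illustrative, not this deployment's; keep whatever other filters are already enabled and just omit `AvailabilityZoneFilter`:

```ini
# /etc/nova/nova.conf -- illustrative fragment, not this deployment's config
[filter_scheduler]
# AvailabilityZoneFilter removed; AZ enforcement is done by placement
enabled_filters = ComputeFilter,ComputeCapabilitiesFilter,ImagePropertiesFilter,ServerGroupAntiAffinityFilter,ServerGroupAffinityFilter

[scheduler]
# default since Xena; shown explicitly for clarity
query_placement_for_availability_zone = true
```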
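The crash itself happens before the scheduler ever serves a request: building the placement client, keystoneauth1 finds no auth plugin configured and raises `MissingAuthPlugin`, which almost always means the `[placement]` section of nova.conf is missing or lacks its auth options. A minimal sketch of what that section looks like — `auth_url`, `region_name`, and `PLACEMENT_PASS` below are placeholders for this deployment's actual Keystone endpoint, region, and service password:

```ini
# /etc/nova/nova.conf -- hypothetical values; substitute your own
[placement]
auth_type = password
auth_url = http://controller:5000/v3
project_domain_name = Default
user_domain_name = Default
project_name = service
username = placement
password = PLACEMENT_PASS
region_name = RegionOne
```

After filling in the section, restarting the service (e.g. `systemctl restart nova-scheduler`) should stop the crash loop; if the same traceback persists, the credentials or endpoint are likely wrong rather than absent.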
[... elided: the identical start-up sequence (MySQL mode DEBUG, cell-cache DEBUG, AvailabilityZoneFilter WARNING) and the identical MissingAuthPlugin traceback repeat roughly every two seconds as the scheduler is respawned — PIDs 112270, 112280, 112290, 112300, 112310, 112320, 112331, from 2026-04-23 00:39:15.502 through 00:39:29.019; only timestamps, PIDs and request IDs change ...]
2026-04-23 00:39:31.079 112341 DEBUG oslo_db.sqlalchemy.engines [req-e462a008-caa6-434b-88c3-6a9d10cf2db7 - - - - -] MySQL server mode set to
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-23 00:39:31.112 112341 DEBUG nova.scheduler.host_manager [req-e462a008-caa6-434b-88c3-6a9d10cf2db7 - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-23 00:39:31.112 112341 DEBUG nova.scheduler.host_manager [req-e462a008-caa6-434b-88c3-6a9d10cf2db7 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-23 00:39:31.116 112341 WARNING nova.scheduler.filters.availability_zone_filter [req-e462a008-caa6-434b-88c3-6a9d10cf2db7 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-23 00:39:31.199 112341 ERROR nova.scheduler.client.report [req-e462a008-caa6-434b-88c3-6a9d10cf2db7 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:39:31.200 112341 CRITICAL nova [req-e462a008-caa6-434b-88c3-6a9d10cf2db7 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:39:31.200 112341 ERROR nova Traceback (most recent call last): 2026-04-23 00:39:31.200 112341 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-23 00:39:31.200 112341 ERROR nova sys.exit(main()) 2026-04-23 00:39:31.200 112341 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-23 00:39:31.200 112341 ERROR nova server = service.Service.create( 2026-04-23 00:39:31.200 112341 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-23 00:39:31.200 112341 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-23 00:39:31.200 112341 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-23 00:39:31.200 112341 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-23 00:39:31.200 112341 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-23 00:39:31.200 112341 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-23 00:39:31.200 112341 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-23 00:39:31.200 112341 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-23 00:39:31.200 112341 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-23 00:39:31.200 112341 ERROR nova 
self._client = self._create_client() 2026-04-23 00:39:31.200 112341 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-23 00:39:31.200 112341 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-23 00:39:31.200 112341 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-23 00:39:31.200 112341 ERROR nova return getattr(conn, service_type) 2026-04-23 00:39:31.200 112341 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-23 00:39:31.200 112341 ERROR nova proxy = self._make_proxy(instance) 2026-04-23 00:39:31.200 112341 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-23 00:39:31.200 112341 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-23 00:39:31.200 112341 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-23 00:39:31.200 112341 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-23 00:39:31.200 112341 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-23 00:39:31.200 112341 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-23 00:39:31.200 112341 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-23 00:39:31.200 112341 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-23 00:39:31.200 112341 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:39:31.200 112341 ERROR nova 2026-04-23 00:39:33.859 112351 DEBUG oslo_db.sqlalchemy.engines [req-bf8968b2-dcdc-4b25-bf1c-aac7f0373f53 - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-23 00:39:33.945 112351 DEBUG nova.scheduler.host_manager [req-bf8968b2-dcdc-4b25-bf1c-aac7f0373f53 - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-23 00:39:33.948 112351 DEBUG nova.scheduler.host_manager [req-bf8968b2-dcdc-4b25-bf1c-aac7f0373f53 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-23 00:39:33.954 112351 WARNING nova.scheduler.filters.availability_zone_filter [req-bf8968b2-dcdc-4b25-bf1c-aac7f0373f53 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-23 00:39:34.072 112351 ERROR nova.scheduler.client.report [req-bf8968b2-dcdc-4b25-bf1c-aac7f0373f53 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:39:34.073 112351 CRITICAL nova [req-bf8968b2-dcdc-4b25-bf1c-aac7f0373f53 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:39:34.073 112351 ERROR nova Traceback (most recent call last): 2026-04-23 00:39:34.073 112351 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-23 00:39:34.073 112351 ERROR nova sys.exit(main()) 2026-04-23 00:39:34.073 112351 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-23 00:39:34.073 112351 ERROR nova server = service.Service.create( 2026-04-23 00:39:34.073 112351 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-23 00:39:34.073 112351 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-23 00:39:34.073 112351 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-23 00:39:34.073 112351 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-23 00:39:34.073 112351 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-23 00:39:34.073 112351 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-23 00:39:34.073 112351 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-23 00:39:34.073 112351 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-23 00:39:34.073 112351 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-23 00:39:34.073 112351 ERROR nova 
self._client = self._create_client() 2026-04-23 00:39:34.073 112351 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-23 00:39:34.073 112351 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-23 00:39:34.073 112351 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-23 00:39:34.073 112351 ERROR nova return getattr(conn, service_type) 2026-04-23 00:39:34.073 112351 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-23 00:39:34.073 112351 ERROR nova proxy = self._make_proxy(instance) 2026-04-23 00:39:34.073 112351 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-23 00:39:34.073 112351 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-23 00:39:34.073 112351 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-23 00:39:34.073 112351 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-23 00:39:34.073 112351 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-23 00:39:34.073 112351 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-23 00:39:34.073 112351 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-23 00:39:34.073 112351 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-23 00:39:34.073 112351 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:39:34.073 112351 ERROR nova 2026-04-23 00:39:36.872 112670 DEBUG oslo_db.sqlalchemy.engines [req-c9111c9e-c352-4e65-8c0e-4509440bac04 - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-23 00:39:36.923 112670 DEBUG nova.scheduler.host_manager [req-c9111c9e-c352-4e65-8c0e-4509440bac04 - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-23 00:39:36.923 112670 DEBUG nova.scheduler.host_manager [req-c9111c9e-c352-4e65-8c0e-4509440bac04 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-23 00:39:36.928 112670 WARNING nova.scheduler.filters.availability_zone_filter [req-c9111c9e-c352-4e65-8c0e-4509440bac04 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-23 00:39:37.038 112670 ERROR nova.scheduler.client.report [req-c9111c9e-c352-4e65-8c0e-4509440bac04 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:39:37.038 112670 CRITICAL nova [req-c9111c9e-c352-4e65-8c0e-4509440bac04 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:39:37.038 112670 ERROR nova Traceback (most recent call last): 2026-04-23 00:39:37.038 112670 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-23 00:39:37.038 112670 ERROR nova sys.exit(main()) 2026-04-23 00:39:37.038 112670 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-23 00:39:37.038 112670 ERROR nova server = service.Service.create( 2026-04-23 00:39:37.038 112670 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-23 00:39:37.038 112670 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-23 00:39:37.038 112670 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-23 00:39:37.038 112670 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-23 00:39:37.038 112670 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-23 00:39:37.038 112670 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-23 00:39:37.038 112670 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-23 00:39:37.038 112670 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-23 00:39:37.038 112670 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-23 00:39:37.038 112670 ERROR nova 
self._client = self._create_client() 2026-04-23 00:39:37.038 112670 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-23 00:39:37.038 112670 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-23 00:39:37.038 112670 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-23 00:39:37.038 112670 ERROR nova return getattr(conn, service_type) 2026-04-23 00:39:37.038 112670 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-23 00:39:37.038 112670 ERROR nova proxy = self._make_proxy(instance) 2026-04-23 00:39:37.038 112670 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-23 00:39:37.038 112670 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-23 00:39:37.038 112670 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-23 00:39:37.038 112670 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-23 00:39:37.038 112670 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-23 00:39:37.038 112670 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-23 00:39:37.038 112670 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-23 00:39:37.038 112670 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-23 00:39:37.038 112670 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:39:37.038 112670 ERROR nova 2026-04-23 00:39:39.731 113184 DEBUG oslo_db.sqlalchemy.engines [req-ab21ce4c-5d00-4845-8412-c41405af8b38 - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-23 00:39:39.770 113184 DEBUG nova.scheduler.host_manager [req-ab21ce4c-5d00-4845-8412-c41405af8b38 - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-23 00:39:39.770 113184 DEBUG nova.scheduler.host_manager [req-ab21ce4c-5d00-4845-8412-c41405af8b38 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-23 00:39:39.775 113184 WARNING nova.scheduler.filters.availability_zone_filter [req-ab21ce4c-5d00-4845-8412-c41405af8b38 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-23 00:39:39.868 113184 ERROR nova.scheduler.client.report [req-ab21ce4c-5d00-4845-8412-c41405af8b38 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:39:39.868 113184 CRITICAL nova [req-ab21ce4c-5d00-4845-8412-c41405af8b38 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:39:39.868 113184 ERROR nova Traceback (most recent call last): 2026-04-23 00:39:39.868 113184 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-23 00:39:39.868 113184 ERROR nova sys.exit(main()) 2026-04-23 00:39:39.868 113184 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-23 00:39:39.868 113184 ERROR nova server = service.Service.create( 2026-04-23 00:39:39.868 113184 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-23 00:39:39.868 113184 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-23 00:39:39.868 113184 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-23 00:39:39.868 113184 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-23 00:39:39.868 113184 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-23 00:39:39.868 113184 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-23 00:39:39.868 113184 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-23 00:39:39.868 113184 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-23 00:39:39.868 113184 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-23 00:39:39.868 113184 ERROR nova 
self._client = self._create_client() 2026-04-23 00:39:39.868 113184 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-23 00:39:39.868 113184 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-23 00:39:39.868 113184 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-23 00:39:39.868 113184 ERROR nova return getattr(conn, service_type) 2026-04-23 00:39:39.868 113184 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-23 00:39:39.868 113184 ERROR nova proxy = self._make_proxy(instance) 2026-04-23 00:39:39.868 113184 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-23 00:39:39.868 113184 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-23 00:39:39.868 113184 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-23 00:39:39.868 113184 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-23 00:39:39.868 113184 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-23 00:39:39.868 113184 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-23 00:39:39.868 113184 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-23 00:39:39.868 113184 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-23 00:39:39.868 113184 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:39:39.868 113184 ERROR nova 2026-04-23 00:39:42.511 113597 DEBUG oslo_db.sqlalchemy.engines [req-62667e4f-2121-4080-915a-fc5e37801c13 - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-23 00:39:42.551 113597 DEBUG nova.scheduler.host_manager [req-62667e4f-2121-4080-915a-fc5e37801c13 - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-23 00:39:42.552 113597 DEBUG nova.scheduler.host_manager [req-62667e4f-2121-4080-915a-fc5e37801c13 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-23 00:39:42.558 113597 WARNING nova.scheduler.filters.availability_zone_filter [req-62667e4f-2121-4080-915a-fc5e37801c13 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-23 00:39:42.666 113597 ERROR nova.scheduler.client.report [req-62667e4f-2121-4080-915a-fc5e37801c13 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:39:42.666 113597 CRITICAL nova [req-62667e4f-2121-4080-915a-fc5e37801c13 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:39:42.666 113597 ERROR nova Traceback (most recent call last): 2026-04-23 00:39:42.666 113597 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-23 00:39:42.666 113597 ERROR nova sys.exit(main()) 2026-04-23 00:39:42.666 113597 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-23 00:39:42.666 113597 ERROR nova server = service.Service.create( 2026-04-23 00:39:42.666 113597 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-23 00:39:42.666 113597 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-23 00:39:42.666 113597 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-23 00:39:42.666 113597 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-23 00:39:42.666 113597 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-23 00:39:42.666 113597 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-23 00:39:42.666 113597 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-23 00:39:42.666 113597 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-23 00:39:42.666 113597 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-23 00:39:42.666 113597 ERROR nova 
self._client = self._create_client() 2026-04-23 00:39:42.666 113597 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-23 00:39:42.666 113597 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-23 00:39:42.666 113597 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-23 00:39:42.666 113597 ERROR nova return getattr(conn, service_type) 2026-04-23 00:39:42.666 113597 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-23 00:39:42.666 113597 ERROR nova proxy = self._make_proxy(instance) 2026-04-23 00:39:42.666 113597 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-23 00:39:42.666 113597 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-23 00:39:42.666 113597 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-23 00:39:42.666 113597 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-23 00:39:42.666 113597 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-23 00:39:42.666 113597 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-23 00:39:42.666 113597 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-23 00:39:42.666 113597 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-23 00:39:42.666 113597 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:39:42.666 113597 ERROR nova 2026-04-23 00:39:45.287 114059 DEBUG oslo_db.sqlalchemy.engines [req-f4e77e7b-9af3-47cc-a5e9-901236c1eea7 - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-23 00:39:45.334 114059 DEBUG nova.scheduler.host_manager [req-f4e77e7b-9af3-47cc-a5e9-901236c1eea7 - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730
2026-04-23 00:39:45.335 114059 DEBUG nova.scheduler.host_manager [req-f4e77e7b-9af3-47cc-a5e9-901236c1eea7 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742
2026-04-23 00:39:45.339 114059 WARNING nova.scheduler.filters.availability_zone_filter [req-f4e77e7b-9af3-47cc-a5e9-901236c1eea7 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement.
2026-04-23 00:39:45.434 114059 ERROR nova.scheduler.client.report [req-f4e77e7b-9af3-47cc-a5e9-901236c1eea7 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:39:45.435 114059 CRITICAL nova [req-f4e77e7b-9af3-47cc-a5e9-901236c1eea7 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:39:45.435 114059 ERROR nova Traceback (most recent call last):
2026-04-23 00:39:45.435 114059 ERROR nova   File "/usr/bin/nova-scheduler", line 10, in <module>
2026-04-23 00:39:45.435 114059 ERROR nova     sys.exit(main())
2026-04-23 00:39:45.435 114059 ERROR nova   File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main
2026-04-23 00:39:45.435 114059 ERROR nova     server = service.Service.create(
2026-04-23 00:39:45.435 114059 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create
2026-04-23 00:39:45.435 114059 ERROR nova     service_obj = cls(host, binary, topic, manager,
2026-04-23 00:39:45.435 114059 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__
2026-04-23 00:39:45.435 114059 ERROR nova     self.manager = manager_class(host=self.host, *args, **kwargs)
2026-04-23 00:39:45.435 114059 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__
2026-04-23 00:39:45.435 114059 ERROR nova     self.placement_client = report.report_client_singleton()
2026-04-23 00:39:45.435 114059 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton
2026-04-23 00:39:45.435 114059 ERROR nova     PLACEMENTCLIENT = SchedulerReportClient()
2026-04-23 00:39:45.435 114059 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__
2026-04-23 00:39:45.435 114059 ERROR nova     self._client = self._create_client()
2026-04-23 00:39:45.435 114059 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client
2026-04-23 00:39:45.435 114059 ERROR nova     client = self._adapter or utils.get_sdk_adapter('placement')
2026-04-23 00:39:45.435 114059 ERROR nova   File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter
2026-04-23 00:39:45.435 114059 ERROR nova     return getattr(conn, service_type)
2026-04-23 00:39:45.435 114059 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__
2026-04-23 00:39:45.435 114059 ERROR nova     proxy = self._make_proxy(instance)
2026-04-23 00:39:45.435 114059 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy
2026-04-23 00:39:45.435 114059 ERROR nova     found_version = temp_adapter.get_api_major_version()
2026-04-23 00:39:45.435 114059 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version
2026-04-23 00:39:45.435 114059 ERROR nova     return self.session.get_api_major_version(auth or self.auth, **kwargs)
2026-04-23 00:39:45.435 114059 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version
2026-04-23 00:39:45.435 114059 ERROR nova     auth = self._auth_required(auth, 'determine endpoint URL')
2026-04-23 00:39:45.435 114059 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required
2026-04-23 00:39:45.435 114059 ERROR nova     raise exceptions.MissingAuthPlugin(msg_fmt % msg)
2026-04-23 00:39:45.435 114059 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:39:45.435 114059 ERROR nova
2026-04-23 00:39:48.000 114558 DEBUG oslo_db.sqlalchemy.engines [req-8eeb5b88-90a0-4c46-892a-34ee37e3735f - - - - -] MySQL server mode set to
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-23 00:39:48.043 114558 DEBUG nova.scheduler.host_manager [req-8eeb5b88-90a0-4c46-892a-34ee37e3735f - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730
2026-04-23 00:39:48.043 114558 DEBUG nova.scheduler.host_manager [req-8eeb5b88-90a0-4c46-892a-34ee37e3735f - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742
2026-04-23 00:39:48.052 114558 WARNING nova.scheduler.filters.availability_zone_filter [req-8eeb5b88-90a0-4c46-892a-34ee37e3735f - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement.
2026-04-23 00:39:48.140 114558 ERROR nova.scheduler.client.report [req-8eeb5b88-90a0-4c46-892a-34ee37e3735f - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:39:48.140 114558 CRITICAL nova [req-8eeb5b88-90a0-4c46-892a-34ee37e3735f - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:39:48.140 114558 ERROR nova Traceback (most recent call last):
2026-04-23 00:39:48.140 114558 ERROR nova   File "/usr/bin/nova-scheduler", line 10, in <module>
2026-04-23 00:39:48.140 114558 ERROR nova     sys.exit(main())
2026-04-23 00:39:48.140 114558 ERROR nova   File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main
2026-04-23 00:39:48.140 114558 ERROR nova     server = service.Service.create(
2026-04-23 00:39:48.140 114558 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create
2026-04-23 00:39:48.140 114558 ERROR nova     service_obj = cls(host, binary, topic, manager,
2026-04-23 00:39:48.140 114558 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__
2026-04-23 00:39:48.140 114558 ERROR nova     self.manager = manager_class(host=self.host, *args, **kwargs)
2026-04-23 00:39:48.140 114558 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__
2026-04-23 00:39:48.140 114558 ERROR nova     self.placement_client = report.report_client_singleton()
2026-04-23 00:39:48.140 114558 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton
2026-04-23 00:39:48.140 114558 ERROR nova     PLACEMENTCLIENT = SchedulerReportClient()
2026-04-23 00:39:48.140 114558 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__
2026-04-23 00:39:48.140 114558 ERROR nova     self._client = self._create_client()
2026-04-23 00:39:48.140 114558 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client
2026-04-23 00:39:48.140 114558 ERROR nova     client = self._adapter or utils.get_sdk_adapter('placement')
2026-04-23 00:39:48.140 114558 ERROR nova   File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter
2026-04-23 00:39:48.140 114558 ERROR nova     return getattr(conn, service_type)
2026-04-23 00:39:48.140 114558 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__
2026-04-23 00:39:48.140 114558 ERROR nova     proxy = self._make_proxy(instance)
2026-04-23 00:39:48.140 114558 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy
2026-04-23 00:39:48.140 114558 ERROR nova     found_version = temp_adapter.get_api_major_version()
2026-04-23 00:39:48.140 114558 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version
2026-04-23 00:39:48.140 114558 ERROR nova     return self.session.get_api_major_version(auth or self.auth, **kwargs)
2026-04-23 00:39:48.140 114558 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version
2026-04-23 00:39:48.140 114558 ERROR nova     auth = self._auth_required(auth, 'determine endpoint URL')
2026-04-23 00:39:48.140 114558 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required
2026-04-23 00:39:48.140 114558 ERROR nova     raise exceptions.MissingAuthPlugin(msg_fmt % msg)
2026-04-23 00:39:48.140 114558 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:39:48.140 114558 ERROR nova
2026-04-23 00:39:50.652 115277 DEBUG oslo_db.sqlalchemy.engines [req-c51b86d1-47b5-42df-a155-79888ba40c91 - - - - -] MySQL server mode set to
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-23 00:39:50.690 115277 DEBUG nova.scheduler.host_manager [req-c51b86d1-47b5-42df-a155-79888ba40c91 - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730
2026-04-23 00:39:50.690 115277 DEBUG nova.scheduler.host_manager [req-c51b86d1-47b5-42df-a155-79888ba40c91 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742
2026-04-23 00:39:50.695 115277 WARNING nova.scheduler.filters.availability_zone_filter [req-c51b86d1-47b5-42df-a155-79888ba40c91 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement.
2026-04-23 00:39:50.793 115277 ERROR nova.scheduler.client.report [req-c51b86d1-47b5-42df-a155-79888ba40c91 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:39:50.793 115277 CRITICAL nova [req-c51b86d1-47b5-42df-a155-79888ba40c91 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:39:50.793 115277 ERROR nova Traceback (most recent call last):
2026-04-23 00:39:50.793 115277 ERROR nova   File "/usr/bin/nova-scheduler", line 10, in <module>
2026-04-23 00:39:50.793 115277 ERROR nova     sys.exit(main())
2026-04-23 00:39:50.793 115277 ERROR nova   File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main
2026-04-23 00:39:50.793 115277 ERROR nova     server = service.Service.create(
2026-04-23 00:39:50.793 115277 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create
2026-04-23 00:39:50.793 115277 ERROR nova     service_obj = cls(host, binary, topic, manager,
2026-04-23 00:39:50.793 115277 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__
2026-04-23 00:39:50.793 115277 ERROR nova     self.manager = manager_class(host=self.host, *args, **kwargs)
2026-04-23 00:39:50.793 115277 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__
2026-04-23 00:39:50.793 115277 ERROR nova     self.placement_client = report.report_client_singleton()
2026-04-23 00:39:50.793 115277 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton
2026-04-23 00:39:50.793 115277 ERROR nova     PLACEMENTCLIENT = SchedulerReportClient()
2026-04-23 00:39:50.793 115277 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__
2026-04-23 00:39:50.793 115277 ERROR nova     self._client = self._create_client()
2026-04-23 00:39:50.793 115277 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client
2026-04-23 00:39:50.793 115277 ERROR nova     client = self._adapter or utils.get_sdk_adapter('placement')
2026-04-23 00:39:50.793 115277 ERROR nova   File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter
2026-04-23 00:39:50.793 115277 ERROR nova     return getattr(conn, service_type)
2026-04-23 00:39:50.793 115277 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__
2026-04-23 00:39:50.793 115277 ERROR nova     proxy = self._make_proxy(instance)
2026-04-23 00:39:50.793 115277 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy
2026-04-23 00:39:50.793 115277 ERROR nova     found_version = temp_adapter.get_api_major_version()
2026-04-23 00:39:50.793 115277 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version
2026-04-23 00:39:50.793 115277 ERROR nova     return self.session.get_api_major_version(auth or self.auth, **kwargs)
2026-04-23 00:39:50.793 115277 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version
2026-04-23 00:39:50.793 115277 ERROR nova     auth = self._auth_required(auth, 'determine endpoint URL')
2026-04-23 00:39:50.793 115277 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required
2026-04-23 00:39:50.793 115277 ERROR nova     raise exceptions.MissingAuthPlugin(msg_fmt % msg)
2026-04-23 00:39:50.793 115277 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:39:50.793 115277 ERROR nova
2026-04-23 00:39:53.029 115957 DEBUG oslo_db.sqlalchemy.engines [req-2dfebae2-c2ce-4277-9164-464445c784a3 - - - - -] MySQL server mode set to
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-23 00:39:53.077 115957 DEBUG nova.scheduler.host_manager [req-2dfebae2-c2ce-4277-9164-464445c784a3 - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730
2026-04-23 00:39:53.078 115957 DEBUG nova.scheduler.host_manager [req-2dfebae2-c2ce-4277-9164-464445c784a3 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742
2026-04-23 00:39:53.083 115957 WARNING nova.scheduler.filters.availability_zone_filter [req-2dfebae2-c2ce-4277-9164-464445c784a3 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement.
2026-04-23 00:39:53.195 115957 ERROR nova.scheduler.client.report [req-2dfebae2-c2ce-4277-9164-464445c784a3 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:39:53.196 115957 CRITICAL nova [req-2dfebae2-c2ce-4277-9164-464445c784a3 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:39:53.196 115957 ERROR nova Traceback (most recent call last):
2026-04-23 00:39:53.196 115957 ERROR nova   File "/usr/bin/nova-scheduler", line 10, in <module>
2026-04-23 00:39:53.196 115957 ERROR nova     sys.exit(main())
2026-04-23 00:39:53.196 115957 ERROR nova   File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main
2026-04-23 00:39:53.196 115957 ERROR nova     server = service.Service.create(
2026-04-23 00:39:53.196 115957 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create
2026-04-23 00:39:53.196 115957 ERROR nova     service_obj = cls(host, binary, topic, manager,
2026-04-23 00:39:53.196 115957 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__
2026-04-23 00:39:53.196 115957 ERROR nova     self.manager = manager_class(host=self.host, *args, **kwargs)
2026-04-23 00:39:53.196 115957 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__
2026-04-23 00:39:53.196 115957 ERROR nova     self.placement_client = report.report_client_singleton()
2026-04-23 00:39:53.196 115957 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton
2026-04-23 00:39:53.196 115957 ERROR nova     PLACEMENTCLIENT = SchedulerReportClient()
2026-04-23 00:39:53.196 115957 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__
2026-04-23 00:39:53.196 115957 ERROR nova     self._client = self._create_client()
2026-04-23 00:39:53.196 115957 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client
2026-04-23 00:39:53.196 115957 ERROR nova     client = self._adapter or utils.get_sdk_adapter('placement')
2026-04-23 00:39:53.196 115957 ERROR nova   File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter
2026-04-23 00:39:53.196 115957 ERROR nova     return getattr(conn, service_type)
2026-04-23 00:39:53.196 115957 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__
2026-04-23 00:39:53.196 115957 ERROR nova     proxy = self._make_proxy(instance)
2026-04-23 00:39:53.196 115957 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy
2026-04-23 00:39:53.196 115957 ERROR nova     found_version = temp_adapter.get_api_major_version()
2026-04-23 00:39:53.196 115957 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version
2026-04-23 00:39:53.196 115957 ERROR nova     return self.session.get_api_major_version(auth or self.auth, **kwargs)
2026-04-23 00:39:53.196 115957 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version
2026-04-23 00:39:53.196 115957 ERROR nova     auth = self._auth_required(auth, 'determine endpoint URL')
2026-04-23 00:39:53.196 115957 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required
2026-04-23 00:39:53.196 115957 ERROR nova     raise exceptions.MissingAuthPlugin(msg_fmt % msg)
2026-04-23 00:39:53.196 115957 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:39:53.196 115957 ERROR nova
2026-04-23 00:39:55.609 116506 DEBUG oslo_db.sqlalchemy.engines [req-adc1df89-9a9d-48c1-811b-8dc10e0fa454 - - - - -] MySQL server mode set to
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-23 00:39:55.657 116506 DEBUG nova.scheduler.host_manager [req-adc1df89-9a9d-48c1-811b-8dc10e0fa454 - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730
2026-04-23 00:39:55.657 116506 DEBUG nova.scheduler.host_manager [req-adc1df89-9a9d-48c1-811b-8dc10e0fa454 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742
2026-04-23 00:39:55.663 116506 WARNING nova.scheduler.filters.availability_zone_filter [req-adc1df89-9a9d-48c1-811b-8dc10e0fa454 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement.
2026-04-23 00:39:55.751 116506 ERROR nova.scheduler.client.report [req-adc1df89-9a9d-48c1-811b-8dc10e0fa454 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:39:55.751 116506 CRITICAL nova [req-adc1df89-9a9d-48c1-811b-8dc10e0fa454 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:39:55.751 116506 ERROR nova Traceback (most recent call last):
2026-04-23 00:39:55.751 116506 ERROR nova   File "/usr/bin/nova-scheduler", line 10, in <module>
2026-04-23 00:39:55.751 116506 ERROR nova     sys.exit(main())
2026-04-23 00:39:55.751 116506 ERROR nova   File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main
2026-04-23 00:39:55.751 116506 ERROR nova     server = service.Service.create(
2026-04-23 00:39:55.751 116506 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create
2026-04-23 00:39:55.751 116506 ERROR nova     service_obj = cls(host, binary, topic, manager,
2026-04-23 00:39:55.751 116506 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__
2026-04-23 00:39:55.751 116506 ERROR nova     self.manager = manager_class(host=self.host, *args, **kwargs)
2026-04-23 00:39:55.751 116506 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__
2026-04-23 00:39:55.751 116506 ERROR nova     self.placement_client = report.report_client_singleton()
2026-04-23 00:39:55.751 116506 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton
2026-04-23 00:39:55.751 116506 ERROR nova     PLACEMENTCLIENT = SchedulerReportClient()
2026-04-23 00:39:55.751 116506 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__
2026-04-23 00:39:55.751 116506 ERROR nova     self._client = self._create_client()
2026-04-23 00:39:55.751 116506 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client
2026-04-23 00:39:55.751 116506 ERROR nova     client = self._adapter or utils.get_sdk_adapter('placement')
2026-04-23 00:39:55.751 116506 ERROR nova   File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter
2026-04-23 00:39:55.751 116506 ERROR nova     return getattr(conn, service_type)
2026-04-23 00:39:55.751 116506 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__
2026-04-23 00:39:55.751 116506 ERROR nova     proxy = self._make_proxy(instance)
2026-04-23 00:39:55.751 116506 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy
2026-04-23 00:39:55.751 116506 ERROR nova     found_version = temp_adapter.get_api_major_version()
2026-04-23 00:39:55.751 116506 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version
2026-04-23 00:39:55.751 116506 ERROR nova     return self.session.get_api_major_version(auth or self.auth, **kwargs)
2026-04-23 00:39:55.751 116506 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version
2026-04-23 00:39:55.751 116506 ERROR nova     auth = self._auth_required(auth, 'determine endpoint URL')
2026-04-23 00:39:55.751 116506 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required
2026-04-23 00:39:55.751 116506 ERROR nova     raise exceptions.MissingAuthPlugin(msg_fmt % msg)
2026-04-23 00:39:55.751 116506 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:39:55.751 116506 ERROR nova
2026-04-23 00:39:58.189 116692 DEBUG oslo_db.sqlalchemy.engines [req-edceb159-50be-4102-b3a4-fec6d2ca86a9 - - - - -] MySQL server mode set to
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-23 00:39:58.237 116692 DEBUG nova.scheduler.host_manager [req-edceb159-50be-4102-b3a4-fec6d2ca86a9 - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730
2026-04-23 00:39:58.238 116692 DEBUG nova.scheduler.host_manager [req-edceb159-50be-4102-b3a4-fec6d2ca86a9 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742
2026-04-23 00:39:58.242 116692 WARNING nova.scheduler.filters.availability_zone_filter [req-edceb159-50be-4102-b3a4-fec6d2ca86a9 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement.
2026-04-23 00:39:58.329 116692 ERROR nova.scheduler.client.report [req-edceb159-50be-4102-b3a4-fec6d2ca86a9 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:39:58.330 116692 CRITICAL nova [req-edceb159-50be-4102-b3a4-fec6d2ca86a9 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:39:58.330 116692 ERROR nova Traceback (most recent call last):
2026-04-23 00:39:58.330 116692 ERROR nova   File "/usr/bin/nova-scheduler", line 10, in <module>
2026-04-23 00:39:58.330 116692 ERROR nova     sys.exit(main())
2026-04-23 00:39:58.330 116692 ERROR nova   File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main
2026-04-23 00:39:58.330 116692 ERROR nova     server = service.Service.create(
2026-04-23 00:39:58.330 116692 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create
2026-04-23 00:39:58.330 116692 ERROR nova     service_obj = cls(host, binary, topic, manager,
2026-04-23 00:39:58.330 116692 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__
2026-04-23 00:39:58.330 116692 ERROR nova     self.manager = manager_class(host=self.host, *args, **kwargs)
2026-04-23 00:39:58.330 116692 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__
2026-04-23 00:39:58.330 116692 ERROR nova     self.placement_client = report.report_client_singleton()
2026-04-23 00:39:58.330 116692 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton
2026-04-23 00:39:58.330 116692 ERROR nova     PLACEMENTCLIENT = SchedulerReportClient()
2026-04-23 00:39:58.330 116692 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__
2026-04-23 00:39:58.330 116692 ERROR nova     self._client = self._create_client()
2026-04-23 00:39:58.330 116692 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client
2026-04-23 00:39:58.330 116692 ERROR nova     client = self._adapter or utils.get_sdk_adapter('placement')
2026-04-23 00:39:58.330 116692 ERROR nova   File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter
2026-04-23 00:39:58.330 116692 ERROR nova     return getattr(conn, service_type)
2026-04-23 00:39:58.330 116692 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__
2026-04-23 00:39:58.330 116692 ERROR nova     proxy = self._make_proxy(instance)
2026-04-23 00:39:58.330 116692 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy
2026-04-23 00:39:58.330 116692 ERROR nova     found_version = temp_adapter.get_api_major_version()
2026-04-23 00:39:58.330 116692 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version
2026-04-23 00:39:58.330 116692 ERROR nova     return self.session.get_api_major_version(auth or self.auth, **kwargs)
2026-04-23 00:39:58.330 116692 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version
2026-04-23 00:39:58.330 116692 ERROR nova     auth = self._auth_required(auth, 'determine endpoint URL')
2026-04-23 00:39:58.330 116692 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required
2026-04-23 00:39:58.330 116692 ERROR nova     raise exceptions.MissingAuthPlugin(msg_fmt % msg)
2026-04-23 00:39:58.330 116692 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:39:58.330 116692 ERROR nova
2026-04-23 00:40:00.723 117514 DEBUG oslo_db.sqlalchemy.engines [req-ea4d45be-fc40-498f-96b4-a5d15e756244 - - - - -] MySQL server mode set to
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-23 00:40:00.819 117514 DEBUG nova.scheduler.host_manager [req-ea4d45be-fc40-498f-96b4-a5d15e756244 - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730
2026-04-23 00:40:00.819 117514 DEBUG nova.scheduler.host_manager [req-ea4d45be-fc40-498f-96b4-a5d15e756244 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742
2026-04-23 00:40:00.824 117514 WARNING nova.scheduler.filters.availability_zone_filter [req-ea4d45be-fc40-498f-96b4-a5d15e756244 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement.
2026-04-23 00:40:00.942 117514 ERROR nova.scheduler.client.report [req-ea4d45be-fc40-498f-96b4-a5d15e756244 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:40:00.943 117514 CRITICAL nova [req-ea4d45be-fc40-498f-96b4-a5d15e756244 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:40:00.943 117514 ERROR nova Traceback (most recent call last):
2026-04-23 00:40:00.943 117514 ERROR nova   File "/usr/bin/nova-scheduler", line 10, in <module>
2026-04-23 00:40:00.943 117514 ERROR nova     sys.exit(main())
2026-04-23 00:40:00.943 117514 ERROR nova   File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main
2026-04-23 00:40:00.943 117514 ERROR nova     server = service.Service.create(
2026-04-23 00:40:00.943 117514 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create
2026-04-23 00:40:00.943 117514 ERROR nova     service_obj = cls(host, binary, topic, manager,
2026-04-23 00:40:00.943 117514 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__
2026-04-23 00:40:00.943 117514 ERROR nova     self.manager = manager_class(host=self.host, *args, **kwargs)
2026-04-23 00:40:00.943 117514 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__
2026-04-23 00:40:00.943 117514 ERROR nova     self.placement_client = report.report_client_singleton()
2026-04-23 00:40:00.943 117514 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton
2026-04-23 00:40:00.943 117514 ERROR nova     PLACEMENTCLIENT = SchedulerReportClient()
2026-04-23 00:40:00.943 117514 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__
2026-04-23 00:40:00.943 117514 ERROR nova     self._client = self._create_client()
2026-04-23 00:40:00.943 117514 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client
2026-04-23 00:40:00.943 117514 ERROR nova     client = self._adapter or utils.get_sdk_adapter('placement')
2026-04-23 00:40:00.943 117514 ERROR nova   File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter
2026-04-23 00:40:00.943 117514 ERROR nova     return getattr(conn, service_type)
2026-04-23 00:40:00.943 117514 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__
2026-04-23 00:40:00.943 117514 ERROR nova     proxy = self._make_proxy(instance)
2026-04-23 00:40:00.943 117514 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy
2026-04-23 00:40:00.943 117514 ERROR nova     found_version = temp_adapter.get_api_major_version()
2026-04-23 00:40:00.943 117514 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version
2026-04-23 00:40:00.943 117514 ERROR nova     return self.session.get_api_major_version(auth or self.auth, **kwargs)
2026-04-23 00:40:00.943 117514 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version
2026-04-23 00:40:00.943 117514 ERROR nova     auth = self._auth_required(auth, 'determine endpoint URL')
2026-04-23 00:40:00.943 117514 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required
2026-04-23 00:40:00.943 117514 ERROR nova     raise exceptions.MissingAuthPlugin(msg_fmt % msg)
2026-04-23 00:40:00.943 117514 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:40:00.943 117514 ERROR nova
2026-04-23 00:40:03.241 118266 DEBUG oslo_db.sqlalchemy.engines [req-dfaf59d7-bbe5-4e9a-a4e2-b165a72b3758 - - - - -] MySQL server mode set to
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-23 00:40:03.274 118266 DEBUG nova.scheduler.host_manager [req-dfaf59d7-bbe5-4e9a-a4e2-b165a72b3758 - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-23 00:40:03.275 118266 DEBUG nova.scheduler.host_manager [req-dfaf59d7-bbe5-4e9a-a4e2-b165a72b3758 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-23 00:40:03.279 118266 WARNING nova.scheduler.filters.availability_zone_filter [req-dfaf59d7-bbe5-4e9a-a4e2-b165a72b3758 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
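The MissingAuthPlugin crash above means keystoneauth found no credentials when nova-scheduler built its SDK adapter for the placement service, which almost always indicates a missing or empty [placement] section in nova.conf. A minimal sketch of the section the scheduler expects is below; every value is a placeholder (hostname, domain names, project, and password are assumptions, not taken from this deployment):

```ini
# /etc/nova/nova.conf -- [placement] credentials for the scheduler's
# placement client. All values below are placeholders; substitute the
# keystone endpoint and service credentials for your deployment.
[placement]
auth_type = password
auth_url = http://controller:5000/v3
project_domain_name = Default
user_domain_name = Default
project_name = service
username = placement
password = PLACEMENT_PASS
region_name = RegionOne
```

After adding the section, restart nova-scheduler; `nova-status upgrade check` can then be used to confirm that the placement API is reachable with those credentials.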
2026-04-23 00:40:03.363 118266 ERROR nova.scheduler.client.report [req-dfaf59d7-bbe5-4e9a-a4e2-b165a72b3758 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:40:03.364 118266 CRITICAL nova [req-dfaf59d7-bbe5-4e9a-a4e2-b165a72b3758 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:40:03.364 118266 ERROR nova Traceback (most recent call last): 2026-04-23 00:40:03.364 118266 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-23 00:40:03.364 118266 ERROR nova sys.exit(main()) 2026-04-23 00:40:03.364 118266 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-23 00:40:03.364 118266 ERROR nova server = service.Service.create( 2026-04-23 00:40:03.364 118266 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-23 00:40:03.364 118266 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-23 00:40:03.364 118266 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-23 00:40:03.364 118266 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-23 00:40:03.364 118266 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-23 00:40:03.364 118266 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-23 00:40:03.364 118266 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-23 00:40:03.364 118266 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-23 00:40:03.364 118266 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-23 00:40:03.364 118266 ERROR nova 
self._client = self._create_client() 2026-04-23 00:40:03.364 118266 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-23 00:40:03.364 118266 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-23 00:40:03.364 118266 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-23 00:40:03.364 118266 ERROR nova return getattr(conn, service_type) 2026-04-23 00:40:03.364 118266 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-23 00:40:03.364 118266 ERROR nova proxy = self._make_proxy(instance) 2026-04-23 00:40:03.364 118266 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-23 00:40:03.364 118266 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-23 00:40:03.364 118266 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-23 00:40:03.364 118266 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-23 00:40:03.364 118266 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-23 00:40:03.364 118266 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-23 00:40:03.364 118266 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-23 00:40:03.364 118266 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-23 00:40:03.364 118266 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:40:03.364 118266 ERROR nova 2026-04-23 00:40:05.776 119025 DEBUG oslo_db.sqlalchemy.engines [req-f7d3385b-2398-4a06-82c2-147e1a93f8e9 - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-23 00:40:05.832 119025 DEBUG nova.scheduler.host_manager [req-f7d3385b-2398-4a06-82c2-147e1a93f8e9 - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-23 00:40:05.832 119025 DEBUG nova.scheduler.host_manager [req-f7d3385b-2398-4a06-82c2-147e1a93f8e9 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-23 00:40:05.836 119025 WARNING nova.scheduler.filters.availability_zone_filter [req-f7d3385b-2398-4a06-82c2-147e1a93f8e9 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-23 00:40:05.934 119025 ERROR nova.scheduler.client.report [req-f7d3385b-2398-4a06-82c2-147e1a93f8e9 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:40:05.934 119025 CRITICAL nova [req-f7d3385b-2398-4a06-82c2-147e1a93f8e9 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:40:05.934 119025 ERROR nova Traceback (most recent call last): 2026-04-23 00:40:05.934 119025 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-23 00:40:05.934 119025 ERROR nova sys.exit(main()) 2026-04-23 00:40:05.934 119025 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-23 00:40:05.934 119025 ERROR nova server = service.Service.create( 2026-04-23 00:40:05.934 119025 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-23 00:40:05.934 119025 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-23 00:40:05.934 119025 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-23 00:40:05.934 119025 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-23 00:40:05.934 119025 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-23 00:40:05.934 119025 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-23 00:40:05.934 119025 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-23 00:40:05.934 119025 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-23 00:40:05.934 119025 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-23 00:40:05.934 119025 ERROR nova 
self._client = self._create_client() 2026-04-23 00:40:05.934 119025 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-23 00:40:05.934 119025 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-23 00:40:05.934 119025 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-23 00:40:05.934 119025 ERROR nova return getattr(conn, service_type) 2026-04-23 00:40:05.934 119025 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-23 00:40:05.934 119025 ERROR nova proxy = self._make_proxy(instance) 2026-04-23 00:40:05.934 119025 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-23 00:40:05.934 119025 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-23 00:40:05.934 119025 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-23 00:40:05.934 119025 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-23 00:40:05.934 119025 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-23 00:40:05.934 119025 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-23 00:40:05.934 119025 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-23 00:40:05.934 119025 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-23 00:40:05.934 119025 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:40:05.934 119025 ERROR nova 2026-04-23 00:40:08.268 119993 DEBUG oslo_db.sqlalchemy.engines [req-718e69fa-d7dc-4126-b395-457001c33b84 - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-23 00:40:08.301 119993 DEBUG nova.scheduler.host_manager [req-718e69fa-d7dc-4126-b395-457001c33b84 - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-23 00:40:08.301 119993 DEBUG nova.scheduler.host_manager [req-718e69fa-d7dc-4126-b395-457001c33b84 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-23 00:40:08.305 119993 WARNING nova.scheduler.filters.availability_zone_filter [req-718e69fa-d7dc-4126-b395-457001c33b84 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-23 00:40:08.386 119993 ERROR nova.scheduler.client.report [req-718e69fa-d7dc-4126-b395-457001c33b84 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:40:08.387 119993 CRITICAL nova [req-718e69fa-d7dc-4126-b395-457001c33b84 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:40:08.387 119993 ERROR nova Traceback (most recent call last): 2026-04-23 00:40:08.387 119993 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-23 00:40:08.387 119993 ERROR nova sys.exit(main()) 2026-04-23 00:40:08.387 119993 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-23 00:40:08.387 119993 ERROR nova server = service.Service.create( 2026-04-23 00:40:08.387 119993 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-23 00:40:08.387 119993 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-23 00:40:08.387 119993 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-23 00:40:08.387 119993 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-23 00:40:08.387 119993 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-23 00:40:08.387 119993 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-23 00:40:08.387 119993 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-23 00:40:08.387 119993 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-23 00:40:08.387 119993 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-23 00:40:08.387 119993 ERROR nova 
self._client = self._create_client() 2026-04-23 00:40:08.387 119993 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-23 00:40:08.387 119993 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-23 00:40:08.387 119993 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-23 00:40:08.387 119993 ERROR nova return getattr(conn, service_type) 2026-04-23 00:40:08.387 119993 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-23 00:40:08.387 119993 ERROR nova proxy = self._make_proxy(instance) 2026-04-23 00:40:08.387 119993 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-23 00:40:08.387 119993 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-23 00:40:08.387 119993 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-23 00:40:08.387 119993 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-23 00:40:08.387 119993 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-23 00:40:08.387 119993 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-23 00:40:08.387 119993 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-23 00:40:08.387 119993 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-23 00:40:08.387 119993 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:40:08.387 119993 ERROR nova 2026-04-23 00:40:10.734 120900 DEBUG oslo_db.sqlalchemy.engines [req-8517f508-a8e4-418c-873d-566d717fdd4a - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-23 00:40:10.768 120900 DEBUG nova.scheduler.host_manager [req-8517f508-a8e4-418c-873d-566d717fdd4a - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-23 00:40:10.768 120900 DEBUG nova.scheduler.host_manager [req-8517f508-a8e4-418c-873d-566d717fdd4a - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-23 00:40:10.772 120900 WARNING nova.scheduler.filters.availability_zone_filter [req-8517f508-a8e4-418c-873d-566d717fdd4a - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-23 00:40:10.849 120900 ERROR nova.scheduler.client.report [req-8517f508-a8e4-418c-873d-566d717fdd4a - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:40:10.849 120900 CRITICAL nova [req-8517f508-a8e4-418c-873d-566d717fdd4a - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:40:10.849 120900 ERROR nova Traceback (most recent call last): 2026-04-23 00:40:10.849 120900 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-23 00:40:10.849 120900 ERROR nova sys.exit(main()) 2026-04-23 00:40:10.849 120900 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-23 00:40:10.849 120900 ERROR nova server = service.Service.create( 2026-04-23 00:40:10.849 120900 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-23 00:40:10.849 120900 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-23 00:40:10.849 120900 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-23 00:40:10.849 120900 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-23 00:40:10.849 120900 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-23 00:40:10.849 120900 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-23 00:40:10.849 120900 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-23 00:40:10.849 120900 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-23 00:40:10.849 120900 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-23 00:40:10.849 120900 ERROR nova 
self._client = self._create_client() 2026-04-23 00:40:10.849 120900 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-23 00:40:10.849 120900 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-23 00:40:10.849 120900 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-23 00:40:10.849 120900 ERROR nova return getattr(conn, service_type) 2026-04-23 00:40:10.849 120900 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-23 00:40:10.849 120900 ERROR nova proxy = self._make_proxy(instance) 2026-04-23 00:40:10.849 120900 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-23 00:40:10.849 120900 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-23 00:40:10.849 120900 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-23 00:40:10.849 120900 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-23 00:40:10.849 120900 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-23 00:40:10.849 120900 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-23 00:40:10.849 120900 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-23 00:40:10.849 120900 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-23 00:40:10.849 120900 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:40:10.849 120900 ERROR nova 2026-04-23 00:40:13.101 121281 DEBUG oslo_db.sqlalchemy.engines [req-99229f4e-dc2a-42f7-b54b-38892e59bd2d - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-23 00:40:14.706 121705 DEBUG oslo_db.sqlalchemy.engines [req-2ea5d256-4b3a-474c-af06-750159de94b0 - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-23 00:40:14.745 121705 DEBUG nova.scheduler.host_manager [req-2ea5d256-4b3a-474c-af06-750159de94b0 - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-23 00:40:14.746 121705 DEBUG nova.scheduler.host_manager [req-2ea5d256-4b3a-474c-af06-750159de94b0 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-23 00:40:14.750 121705 WARNING nova.scheduler.filters.availability_zone_filter [req-2ea5d256-4b3a-474c-af06-750159de94b0 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-23 00:40:14.829 121705 ERROR nova.scheduler.client.report [req-2ea5d256-4b3a-474c-af06-750159de94b0 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:40:14.830 121705 CRITICAL nova [req-2ea5d256-4b3a-474c-af06-750159de94b0 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:40:14.830 121705 ERROR nova Traceback (most recent call last): 2026-04-23 00:40:14.830 121705 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-23 00:40:14.830 121705 ERROR nova sys.exit(main()) 2026-04-23 00:40:14.830 121705 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-23 00:40:14.830 121705 ERROR nova server = service.Service.create( 2026-04-23 00:40:14.830 121705 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-23 00:40:14.830 121705 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-23 00:40:14.830 121705 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-23 00:40:14.830 121705 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-23 00:40:14.830 121705 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-23 00:40:14.830 121705 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-23 00:40:14.830 121705 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-23 00:40:14.830 121705 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-23 00:40:14.830 121705 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-23 00:40:14.830 121705 ERROR nova 
self._client = self._create_client() 2026-04-23 00:40:14.830 121705 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-23 00:40:14.830 121705 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-23 00:40:14.830 121705 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-23 00:40:14.830 121705 ERROR nova return getattr(conn, service_type) 2026-04-23 00:40:14.830 121705 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-23 00:40:14.830 121705 ERROR nova proxy = self._make_proxy(instance) 2026-04-23 00:40:14.830 121705 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-23 00:40:14.830 121705 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-23 00:40:14.830 121705 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-23 00:40:14.830 121705 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-23 00:40:14.830 121705 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-23 00:40:14.830 121705 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-23 00:40:14.830 121705 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-23 00:40:14.830 121705 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-23 00:40:14.830 121705 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-23 00:40:14.830 121705 ERROR nova 2026-04-23 00:40:17.165 122124 DEBUG oslo_db.sqlalchemy.engines [req-3b0b354c-55cd-4e10-a6d0-f02b03040b9a - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-23 00:40:17.203 122124 DEBUG nova.scheduler.host_manager [req-3b0b354c-55cd-4e10-a6d0-f02b03040b9a - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-23 00:40:17.204 122124 DEBUG nova.scheduler.host_manager [req-3b0b354c-55cd-4e10-a6d0-f02b03040b9a - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-23 00:40:17.207 122124 WARNING nova.scheduler.filters.availability_zone_filter [req-3b0b354c-55cd-4e10-a6d0-f02b03040b9a - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-23 00:40:17.284 122124 ERROR nova.scheduler.client.report [req-3b0b354c-55cd-4e10-a6d0-f02b03040b9a - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:40:17.285 122124 CRITICAL nova [req-3b0b354c-55cd-4e10-a6d0-f02b03040b9a - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:40:17.285 122124 ERROR nova Traceback (most recent call last):
2026-04-23 00:40:17.285 122124 ERROR nova   File "/usr/bin/nova-scheduler", line 10, in <module>
2026-04-23 00:40:17.285 122124 ERROR nova     sys.exit(main())
2026-04-23 00:40:17.285 122124 ERROR nova   File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main
2026-04-23 00:40:17.285 122124 ERROR nova     server = service.Service.create(
2026-04-23 00:40:17.285 122124 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create
2026-04-23 00:40:17.285 122124 ERROR nova     service_obj = cls(host, binary, topic, manager,
2026-04-23 00:40:17.285 122124 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__
2026-04-23 00:40:17.285 122124 ERROR nova     self.manager = manager_class(host=self.host, *args, **kwargs)
2026-04-23 00:40:17.285 122124 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__
2026-04-23 00:40:17.285 122124 ERROR nova     self.placement_client = report.report_client_singleton()
2026-04-23 00:40:17.285 122124 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton
2026-04-23 00:40:17.285 122124 ERROR nova     PLACEMENTCLIENT = SchedulerReportClient()
2026-04-23 00:40:17.285 122124 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__
2026-04-23 00:40:17.285 122124 ERROR nova     self._client = self._create_client()
2026-04-23 00:40:17.285 122124 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client
2026-04-23 00:40:17.285 122124 ERROR nova     client = self._adapter or utils.get_sdk_adapter('placement')
2026-04-23 00:40:17.285 122124 ERROR nova   File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter
2026-04-23 00:40:17.285 122124 ERROR nova     return getattr(conn, service_type)
2026-04-23 00:40:17.285 122124 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__
2026-04-23 00:40:17.285 122124 ERROR nova     proxy = self._make_proxy(instance)
2026-04-23 00:40:17.285 122124 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy
2026-04-23 00:40:17.285 122124 ERROR nova     found_version = temp_adapter.get_api_major_version()
2026-04-23 00:40:17.285 122124 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version
2026-04-23 00:40:17.285 122124 ERROR nova     return self.session.get_api_major_version(auth or self.auth, **kwargs)
2026-04-23 00:40:17.285 122124 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version
2026-04-23 00:40:17.285 122124 ERROR nova     auth = self._auth_required(auth, 'determine endpoint URL')
2026-04-23 00:40:17.285 122124 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required
2026-04-23 00:40:17.285 122124 ERROR nova     raise exceptions.MissingAuthPlugin(msg_fmt % msg)
2026-04-23 00:40:17.285 122124 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:40:17.285 122124 ERROR nova
2026-04-23 00:40:19.446 122549 DEBUG oslo_db.sqlalchemy.engines [req-88d3316a-56ec-488f-ad37-ec16008416aa - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-23 00:40:19.481 122549 DEBUG nova.scheduler.host_manager [req-88d3316a-56ec-488f-ad37-ec16008416aa - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730
2026-04-23 00:40:19.481 122549 DEBUG nova.scheduler.host_manager [req-88d3316a-56ec-488f-ad37-ec16008416aa - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742
2026-04-23 00:40:19.486 122549 WARNING nova.scheduler.filters.availability_zone_filter [req-88d3316a-56ec-488f-ad37-ec16008416aa - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement.
2026-04-23 00:40:19.580 122549 ERROR nova.scheduler.client.report [req-88d3316a-56ec-488f-ad37-ec16008416aa - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:40:19.581 122549 CRITICAL nova [req-88d3316a-56ec-488f-ad37-ec16008416aa - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:40:19.581 122549 ERROR nova Traceback (most recent call last):
2026-04-23 00:40:19.581 122549 ERROR nova   File "/usr/bin/nova-scheduler", line 10, in <module>
2026-04-23 00:40:19.581 122549 ERROR nova     sys.exit(main())
2026-04-23 00:40:19.581 122549 ERROR nova   File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main
2026-04-23 00:40:19.581 122549 ERROR nova     server = service.Service.create(
2026-04-23 00:40:19.581 122549 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create
2026-04-23 00:40:19.581 122549 ERROR nova     service_obj = cls(host, binary, topic, manager,
2026-04-23 00:40:19.581 122549 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__
2026-04-23 00:40:19.581 122549 ERROR nova     self.manager = manager_class(host=self.host, *args, **kwargs)
2026-04-23 00:40:19.581 122549 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__
2026-04-23 00:40:19.581 122549 ERROR nova     self.placement_client = report.report_client_singleton()
2026-04-23 00:40:19.581 122549 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton
2026-04-23 00:40:19.581 122549 ERROR nova     PLACEMENTCLIENT = SchedulerReportClient()
2026-04-23 00:40:19.581 122549 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__
2026-04-23 00:40:19.581 122549 ERROR nova     self._client = self._create_client()
2026-04-23 00:40:19.581 122549 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client
2026-04-23 00:40:19.581 122549 ERROR nova     client = self._adapter or utils.get_sdk_adapter('placement')
2026-04-23 00:40:19.581 122549 ERROR nova   File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter
2026-04-23 00:40:19.581 122549 ERROR nova     return getattr(conn, service_type)
2026-04-23 00:40:19.581 122549 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__
2026-04-23 00:40:19.581 122549 ERROR nova     proxy = self._make_proxy(instance)
2026-04-23 00:40:19.581 122549 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy
2026-04-23 00:40:19.581 122549 ERROR nova     found_version = temp_adapter.get_api_major_version()
2026-04-23 00:40:19.581 122549 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version
2026-04-23 00:40:19.581 122549 ERROR nova     return self.session.get_api_major_version(auth or self.auth, **kwargs)
2026-04-23 00:40:19.581 122549 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version
2026-04-23 00:40:19.581 122549 ERROR nova     auth = self._auth_required(auth, 'determine endpoint URL')
2026-04-23 00:40:19.581 122549 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required
2026-04-23 00:40:19.581 122549 ERROR nova     raise exceptions.MissingAuthPlugin(msg_fmt % msg)
2026-04-23 00:40:19.581 122549 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:40:19.581 122549 ERROR nova
2026-04-23 00:40:21.883 123338 DEBUG oslo_db.sqlalchemy.engines [req-9ecea3f0-8c27-40ad-b119-d552cb1fe67e - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-23 00:40:21.913 123338 DEBUG nova.scheduler.host_manager [req-9ecea3f0-8c27-40ad-b119-d552cb1fe67e - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730
2026-04-23 00:40:21.913 123338 DEBUG nova.scheduler.host_manager [req-9ecea3f0-8c27-40ad-b119-d552cb1fe67e - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742
2026-04-23 00:40:21.917 123338 WARNING nova.scheduler.filters.availability_zone_filter [req-9ecea3f0-8c27-40ad-b119-d552cb1fe67e - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement.
2026-04-23 00:40:21.993 123338 ERROR nova.scheduler.client.report [req-9ecea3f0-8c27-40ad-b119-d552cb1fe67e - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:40:21.994 123338 CRITICAL nova [req-9ecea3f0-8c27-40ad-b119-d552cb1fe67e - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:40:21.994 123338 ERROR nova Traceback (most recent call last):
2026-04-23 00:40:21.994 123338 ERROR nova   File "/usr/bin/nova-scheduler", line 10, in <module>
2026-04-23 00:40:21.994 123338 ERROR nova     sys.exit(main())
2026-04-23 00:40:21.994 123338 ERROR nova   File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main
2026-04-23 00:40:21.994 123338 ERROR nova     server = service.Service.create(
2026-04-23 00:40:21.994 123338 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create
2026-04-23 00:40:21.994 123338 ERROR nova     service_obj = cls(host, binary, topic, manager,
2026-04-23 00:40:21.994 123338 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__
2026-04-23 00:40:21.994 123338 ERROR nova     self.manager = manager_class(host=self.host, *args, **kwargs)
2026-04-23 00:40:21.994 123338 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__
2026-04-23 00:40:21.994 123338 ERROR nova     self.placement_client = report.report_client_singleton()
2026-04-23 00:40:21.994 123338 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton
2026-04-23 00:40:21.994 123338 ERROR nova     PLACEMENTCLIENT = SchedulerReportClient()
2026-04-23 00:40:21.994 123338 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__
2026-04-23 00:40:21.994 123338 ERROR nova     self._client = self._create_client()
2026-04-23 00:40:21.994 123338 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client
2026-04-23 00:40:21.994 123338 ERROR nova     client = self._adapter or utils.get_sdk_adapter('placement')
2026-04-23 00:40:21.994 123338 ERROR nova   File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter
2026-04-23 00:40:21.994 123338 ERROR nova     return getattr(conn, service_type)
2026-04-23 00:40:21.994 123338 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__
2026-04-23 00:40:21.994 123338 ERROR nova     proxy = self._make_proxy(instance)
2026-04-23 00:40:21.994 123338 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy
2026-04-23 00:40:21.994 123338 ERROR nova     found_version = temp_adapter.get_api_major_version()
2026-04-23 00:40:21.994 123338 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version
2026-04-23 00:40:21.994 123338 ERROR nova     return self.session.get_api_major_version(auth or self.auth, **kwargs)
2026-04-23 00:40:21.994 123338 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version
2026-04-23 00:40:21.994 123338 ERROR nova     auth = self._auth_required(auth, 'determine endpoint URL')
2026-04-23 00:40:21.994 123338 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required
2026-04-23 00:40:21.994 123338 ERROR nova     raise exceptions.MissingAuthPlugin(msg_fmt % msg)
2026-04-23 00:40:21.994 123338 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:40:21.994 123338 ERROR nova
2026-04-23 00:40:24.071 124118 DEBUG oslo_db.sqlalchemy.engines [req-76174cd7-562e-4325-9548-e8bf28ccb6c9 - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-23 00:40:24.108 124118 DEBUG nova.scheduler.host_manager [req-76174cd7-562e-4325-9548-e8bf28ccb6c9 - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730
2026-04-23 00:40:24.108 124118 DEBUG nova.scheduler.host_manager [req-76174cd7-562e-4325-9548-e8bf28ccb6c9 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742
2026-04-23 00:40:24.113 124118 WARNING nova.scheduler.filters.availability_zone_filter [req-76174cd7-562e-4325-9548-e8bf28ccb6c9 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement.
2026-04-23 00:40:24.193 124118 ERROR nova.scheduler.client.report [req-76174cd7-562e-4325-9548-e8bf28ccb6c9 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:40:24.193 124118 CRITICAL nova [req-76174cd7-562e-4325-9548-e8bf28ccb6c9 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:40:24.193 124118 ERROR nova Traceback (most recent call last):
2026-04-23 00:40:24.193 124118 ERROR nova   File "/usr/bin/nova-scheduler", line 10, in <module>
2026-04-23 00:40:24.193 124118 ERROR nova     sys.exit(main())
2026-04-23 00:40:24.193 124118 ERROR nova   File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main
2026-04-23 00:40:24.193 124118 ERROR nova     server = service.Service.create(
2026-04-23 00:40:24.193 124118 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create
2026-04-23 00:40:24.193 124118 ERROR nova     service_obj = cls(host, binary, topic, manager,
2026-04-23 00:40:24.193 124118 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__
2026-04-23 00:40:24.193 124118 ERROR nova     self.manager = manager_class(host=self.host, *args, **kwargs)
2026-04-23 00:40:24.193 124118 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__
2026-04-23 00:40:24.193 124118 ERROR nova     self.placement_client = report.report_client_singleton()
2026-04-23 00:40:24.193 124118 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton
2026-04-23 00:40:24.193 124118 ERROR nova     PLACEMENTCLIENT = SchedulerReportClient()
2026-04-23 00:40:24.193 124118 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__
2026-04-23 00:40:24.193 124118 ERROR nova     self._client = self._create_client()
2026-04-23 00:40:24.193 124118 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client
2026-04-23 00:40:24.193 124118 ERROR nova     client = self._adapter or utils.get_sdk_adapter('placement')
2026-04-23 00:40:24.193 124118 ERROR nova   File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter
2026-04-23 00:40:24.193 124118 ERROR nova     return getattr(conn, service_type)
2026-04-23 00:40:24.193 124118 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__
2026-04-23 00:40:24.193 124118 ERROR nova     proxy = self._make_proxy(instance)
2026-04-23 00:40:24.193 124118 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy
2026-04-23 00:40:24.193 124118 ERROR nova     found_version = temp_adapter.get_api_major_version()
2026-04-23 00:40:24.193 124118 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version
2026-04-23 00:40:24.193 124118 ERROR nova     return self.session.get_api_major_version(auth or self.auth, **kwargs)
2026-04-23 00:40:24.193 124118 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version
2026-04-23 00:40:24.193 124118 ERROR nova     auth = self._auth_required(auth, 'determine endpoint URL')
2026-04-23 00:40:24.193 124118 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required
2026-04-23 00:40:24.193 124118 ERROR nova     raise exceptions.MissingAuthPlugin(msg_fmt % msg)
2026-04-23 00:40:24.193 124118 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:40:24.193 124118 ERROR nova
2026-04-23 00:40:26.396 125167 DEBUG oslo_db.sqlalchemy.engines [req-20dc2ed3-2ca4-4797-81dc-0c9cc680323f - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-23 00:40:26.445 125167 DEBUG nova.scheduler.host_manager [req-20dc2ed3-2ca4-4797-81dc-0c9cc680323f - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730
2026-04-23 00:40:26.445 125167 DEBUG nova.scheduler.host_manager [req-20dc2ed3-2ca4-4797-81dc-0c9cc680323f - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742
2026-04-23 00:40:26.449 125167 WARNING nova.scheduler.filters.availability_zone_filter [req-20dc2ed3-2ca4-4797-81dc-0c9cc680323f - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement.
2026-04-23 00:40:26.532 125167 ERROR nova.scheduler.client.report [req-20dc2ed3-2ca4-4797-81dc-0c9cc680323f - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:40:26.533 125167 CRITICAL nova [req-20dc2ed3-2ca4-4797-81dc-0c9cc680323f - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:40:26.533 125167 ERROR nova Traceback (most recent call last):
2026-04-23 00:40:26.533 125167 ERROR nova   File "/usr/bin/nova-scheduler", line 10, in <module>
2026-04-23 00:40:26.533 125167 ERROR nova     sys.exit(main())
2026-04-23 00:40:26.533 125167 ERROR nova   File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main
2026-04-23 00:40:26.533 125167 ERROR nova     server = service.Service.create(
2026-04-23 00:40:26.533 125167 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create
2026-04-23 00:40:26.533 125167 ERROR nova     service_obj = cls(host, binary, topic, manager,
2026-04-23 00:40:26.533 125167 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__
2026-04-23 00:40:26.533 125167 ERROR nova     self.manager = manager_class(host=self.host, *args, **kwargs)
2026-04-23 00:40:26.533 125167 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__
2026-04-23 00:40:26.533 125167 ERROR nova     self.placement_client = report.report_client_singleton()
2026-04-23 00:40:26.533 125167 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton
2026-04-23 00:40:26.533 125167 ERROR nova     PLACEMENTCLIENT = SchedulerReportClient()
2026-04-23 00:40:26.533 125167 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__
2026-04-23 00:40:26.533 125167 ERROR nova     self._client = self._create_client()
2026-04-23 00:40:26.533 125167 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client
2026-04-23 00:40:26.533 125167 ERROR nova     client = self._adapter or utils.get_sdk_adapter('placement')
2026-04-23 00:40:26.533 125167 ERROR nova   File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter
2026-04-23 00:40:26.533 125167 ERROR nova     return getattr(conn, service_type)
2026-04-23 00:40:26.533 125167 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__
2026-04-23 00:40:26.533 125167 ERROR nova     proxy = self._make_proxy(instance)
2026-04-23 00:40:26.533 125167 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy
2026-04-23 00:40:26.533 125167 ERROR nova     found_version = temp_adapter.get_api_major_version()
2026-04-23 00:40:26.533 125167 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version
2026-04-23 00:40:26.533 125167 ERROR nova     return self.session.get_api_major_version(auth or self.auth, **kwargs)
2026-04-23 00:40:26.533 125167 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version
2026-04-23 00:40:26.533 125167 ERROR nova     auth = self._auth_required(auth, 'determine endpoint URL')
2026-04-23 00:40:26.533 125167 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required
2026-04-23 00:40:26.533 125167 ERROR nova     raise exceptions.MissingAuthPlugin(msg_fmt % msg)
2026-04-23 00:40:26.533 125167 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:40:26.533 125167 ERROR nova
2026-04-23 00:40:28.590 126153 DEBUG oslo_db.sqlalchemy.engines [req-73f6200c-f993-40a0-95f2-634380712a26 - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-23 00:40:28.622 126153 DEBUG nova.scheduler.host_manager [req-73f6200c-f993-40a0-95f2-634380712a26 - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730
2026-04-23 00:40:28.622 126153 DEBUG nova.scheduler.host_manager [req-73f6200c-f993-40a0-95f2-634380712a26 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742
2026-04-23 00:40:28.626 126153 WARNING nova.scheduler.filters.availability_zone_filter [req-73f6200c-f993-40a0-95f2-634380712a26 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement.
2026-04-23 00:40:28.706 126153 ERROR nova.scheduler.client.report [req-73f6200c-f993-40a0-95f2-634380712a26 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:40:28.707 126153 CRITICAL nova [req-73f6200c-f993-40a0-95f2-634380712a26 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:40:28.707 126153 ERROR nova Traceback (most recent call last):
2026-04-23 00:40:28.707 126153 ERROR nova   File "/usr/bin/nova-scheduler", line 10, in <module>
2026-04-23 00:40:28.707 126153 ERROR nova     sys.exit(main())
2026-04-23 00:40:28.707 126153 ERROR nova   File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main
2026-04-23 00:40:28.707 126153 ERROR nova     server = service.Service.create(
2026-04-23 00:40:28.707 126153 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create
2026-04-23 00:40:28.707 126153 ERROR nova     service_obj = cls(host, binary, topic, manager,
2026-04-23 00:40:28.707 126153 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__
2026-04-23 00:40:28.707 126153 ERROR nova     self.manager = manager_class(host=self.host, *args, **kwargs)
2026-04-23 00:40:28.707 126153 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__
2026-04-23 00:40:28.707 126153 ERROR nova     self.placement_client = report.report_client_singleton()
2026-04-23 00:40:28.707 126153 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton
2026-04-23 00:40:28.707 126153 ERROR nova     PLACEMENTCLIENT = SchedulerReportClient()
2026-04-23 00:40:28.707 126153 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__
2026-04-23 00:40:28.707 126153 ERROR nova     self._client = self._create_client()
2026-04-23 00:40:28.707 126153 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client
2026-04-23 00:40:28.707 126153 ERROR nova     client = self._adapter or utils.get_sdk_adapter('placement')
2026-04-23 00:40:28.707 126153 ERROR nova   File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter
2026-04-23 00:40:28.707 126153 ERROR nova     return getattr(conn, service_type)
2026-04-23 00:40:28.707 126153 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__
2026-04-23 00:40:28.707 126153 ERROR nova     proxy = self._make_proxy(instance)
2026-04-23 00:40:28.707 126153 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy
2026-04-23 00:40:28.707 126153 ERROR nova     found_version = temp_adapter.get_api_major_version()
2026-04-23 00:40:28.707 126153 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version
2026-04-23 00:40:28.707 126153 ERROR nova     return self.session.get_api_major_version(auth or self.auth, **kwargs)
2026-04-23 00:40:28.707 126153 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version
2026-04-23 00:40:28.707 126153 ERROR nova     auth = self._auth_required(auth, 'determine endpoint URL')
2026-04-23 00:40:28.707 126153 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required
2026-04-23 00:40:28.707 126153 ERROR nova     raise exceptions.MissingAuthPlugin(msg_fmt % msg)
2026-04-23 00:40:28.707 126153 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:40:28.707 126153 ERROR nova
2026-04-23 00:40:30.836 126981 DEBUG oslo_db.sqlalchemy.engines [req-82511fe9-c91f-4627-9464-cfaf308014a2 - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-23 00:40:30.867 126981 DEBUG nova.scheduler.host_manager [req-82511fe9-c91f-4627-9464-cfaf308014a2 - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730
2026-04-23 00:40:30.868 126981 DEBUG nova.scheduler.host_manager [req-82511fe9-c91f-4627-9464-cfaf308014a2 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742
2026-04-23 00:40:30.873 126981 WARNING nova.scheduler.filters.availability_zone_filter [req-82511fe9-c91f-4627-9464-cfaf308014a2 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement.
2026-04-23 00:40:30.951 126981 ERROR nova.scheduler.client.report [req-82511fe9-c91f-4627-9464-cfaf308014a2 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:40:30.952 126981 CRITICAL nova [req-82511fe9-c91f-4627-9464-cfaf308014a2 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:40:30.952 126981 ERROR nova Traceback (most recent call last):
2026-04-23 00:40:30.952 126981 ERROR nova   File "/usr/bin/nova-scheduler", line 10, in <module>
2026-04-23 00:40:30.952 126981 ERROR nova     sys.exit(main())
2026-04-23 00:40:30.952 126981 ERROR nova   File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main
2026-04-23 00:40:30.952 126981 ERROR nova     server = service.Service.create(
2026-04-23 00:40:30.952 126981 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create
2026-04-23 00:40:30.952 126981 ERROR nova     service_obj = cls(host, binary, topic, manager,
2026-04-23 00:40:30.952 126981 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__
2026-04-23 00:40:30.952 126981 ERROR nova     self.manager = manager_class(host=self.host, *args, **kwargs)
2026-04-23 00:40:30.952 126981 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__
2026-04-23 00:40:30.952 126981 ERROR nova     self.placement_client = report.report_client_singleton()
2026-04-23 00:40:30.952 126981 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton
2026-04-23 00:40:30.952 126981 ERROR nova     PLACEMENTCLIENT = SchedulerReportClient()
2026-04-23 00:40:30.952 126981 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__
2026-04-23 00:40:30.952 126981 ERROR nova     self._client = self._create_client()
2026-04-23 00:40:30.952 126981 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client
2026-04-23 00:40:30.952 126981 ERROR nova     client = self._adapter or utils.get_sdk_adapter('placement')
2026-04-23 00:40:30.952 126981 ERROR nova   File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter
2026-04-23 00:40:30.952 126981 ERROR nova     return getattr(conn, service_type)
2026-04-23 00:40:30.952 126981 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__
2026-04-23 00:40:30.952 126981 ERROR nova     proxy = self._make_proxy(instance)
2026-04-23 00:40:30.952 126981 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy
2026-04-23 00:40:30.952 126981 ERROR nova     found_version = temp_adapter.get_api_major_version()
2026-04-23 00:40:30.952 126981 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version
2026-04-23 00:40:30.952 126981 ERROR nova     return self.session.get_api_major_version(auth or self.auth, **kwargs)
2026-04-23 00:40:30.952 126981 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version
2026-04-23 00:40:30.952 126981 ERROR nova     auth = self._auth_required(auth, 'determine endpoint URL')
2026-04-23 00:40:30.952 126981 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required
2026-04-23 00:40:30.952 126981 ERROR nova     raise exceptions.MissingAuthPlugin(msg_fmt % msg)
2026-04-23 00:40:30.952 126981 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-23 00:40:30.952 126981 ERROR nova
2026-04-23 00:40:33.062 127783 DEBUG oslo_db.sqlalchemy.engines [req-52bd205d-3eef-4206-863b-080b554ed01c - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-23 00:40:33.094 127783 DEBUG nova.scheduler.host_manager [req-52bd205d-3eef-4206-863b-080b554ed01c - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730
2026-04-23 00:40:33.095 127783 DEBUG nova.scheduler.host_manager [req-52bd205d-3eef-4206-863b-080b554ed01c - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742
2026-04-23 00:40:33.100 127783 WARNING nova.scheduler.filters.availability_zone_filter [req-52bd205d-3eef-4206-863b-080b554ed01c - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement.
2026-04-23 00:40:33.181 127783 DEBUG nova.scheduler.host_manager [req-52bd205d-3eef-4206-863b-080b554ed01c - - - - -] START:_async_init_instance_info _async_init_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:411 2026-04-23 00:40:33.183 127783 DEBUG oslo_concurrency.lockutils [req-d524a4d4-bb45-450c-ba45-453bbe167d88 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:40:33.184 127783 DEBUG oslo_concurrency.lockutils [req-d524a4d4-bb45-450c-ba45-453bbe167d88 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:40:33.190 127783 DEBUG oslo_db.sqlalchemy.engines [req-d524a4d4-bb45-450c-ba45-453bbe167d88 - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-23 00:40:33.202 127783 DEBUG nova.scheduler.host_manager [req-d524a4d4-bb45-450c-ba45-453bbe167d88 - - - - -] Total number of compute nodes: 0 _async_init_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:424 2026-04-23 00:40:33.203 127783 DEBUG oslo_concurrency.lockutils [req-d524a4d4-bb45-450c-ba45-453bbe167d88 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:40:33.203 127783 DEBUG oslo_concurrency.lockutils [req-d524a4d4-bb45-450c-ba45-453bbe167d88 - - - - -] Lock 
"de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:40:33.258 127783 DEBUG nova.scheduler.host_manager [req-d524a4d4-bb45-450c-ba45-453bbe167d88 - - - - -] Adding 0 instances for hosts 10-20 _async_init_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:443 2026-04-23 00:40:33.259 127783 DEBUG nova.scheduler.host_manager [req-d524a4d4-bb45-450c-ba45-453bbe167d88 - - - - -] END:_async_init_instance_info _async_init_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:454 2026-04-23 00:40:34.993 127783 DEBUG nova.context [req-52bd205d-3eef-4206-863b-080b554ed01c - - - - -] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),de8a22a1-b255-45ae-baf4-856041bc1c3f(cell1) load_cells /usr/lib/python3/dist-packages/nova/context.py:464 2026-04-23 00:40:34.994 127783 DEBUG oslo_concurrency.lockutils [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:40:34.995 127783 DEBUG oslo_concurrency.lockutils [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:40:34.995 127783 DEBUG oslo_concurrency.lockutils [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:40:34.996 
127783 DEBUG oslo_concurrency.lockutils [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:40:35.010 127783 DEBUG oslo_db.sqlalchemy.engines [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-23 00:40:35.018 127783 DEBUG oslo_concurrency.lockutils [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:294 2026-04-23 00:40:35.019 127783 DEBUG oslo_concurrency.lockutils [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:312 2026-04-23 00:40:35.019 127783 INFO oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] Starting 4 workers 2026-04-23 00:40:35.023 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] Started child 129199 _start_child /usr/lib/python3/dist-packages/oslo_service/service.py:575 2026-04-23 00:40:35.027 129199 INFO nova.service [-] Starting scheduler node (version 25.2.1) 2026-04-23 00:40:35.028 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] Started child 129200 _start_child /usr/lib/python3/dist-packages/oslo_service/service.py:575 2026-04-23 00:40:35.033 129200 INFO nova.service [-] Starting scheduler node (version 25.2.1) 2026-04-23 00:40:35.032 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] Started child 129201 _start_child 
/usr/lib/python3/dist-packages/oslo_service/service.py:575 2026-04-23 00:40:35.037 129201 INFO nova.service [-] Starting scheduler node (version 25.2.1) 2026-04-23 00:40:35.036 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] Started child 129202 _start_child /usr/lib/python3/dist-packages/oslo_service/service.py:575 2026-04-23 00:40:35.038 129199 DEBUG oslo_db.sqlalchemy.engines [req-f190fc88-ef2f-40c3-973c-af21f6be856a - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-23 00:40:35.039 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] Full set of CONF: wait /usr/lib/python3/dist-packages/oslo_service/service.py:649 2026-04-23 00:40:35.039 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2589 2026-04-23 00:40:35.039 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2590 2026-04-23 00:40:35.039 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] command line args: ['--config-file=/etc/nova/nova.conf', '--log-file=/var/log/nova/nova-scheduler.log'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2591 2026-04-23 00:40:35.040 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] config files: ['/etc/nova/nova.conf'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2592 2026-04-23 00:40:35.040 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - 
- - -] ================================================================================ log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2594 2026-04-23 00:40:35.040 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] allow_resize_to_same_host = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.040 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] arq_binding_timeout = 300 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.040 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] block_device_allocate_retries = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.041 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.041 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cert = self.pem log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.041 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] compute_driver = libvirt.LibvirtDriver log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.041 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] compute_monitors = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.042 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] config_dir = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.041 129202 INFO nova.service [-] Starting scheduler node (version 25.2.1) 2026-04-23 00:40:35.042 127783 DEBUG oslo_service.service 
[req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] config_drive_format = iso9660 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.042 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] config_file = ['/etc/nova/nova.conf'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.042 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] config_source = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.043 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] console_host = juju-0097f2-0-lxd-7 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.043 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] control_exchange = nova log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.043 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cpu_allocation_ratio = 2.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.043 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] daemon = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.044 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] debug = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.044 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.044 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] default_availability_zone = nova log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.044 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] default_ephemeral_format = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.044 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=DEBUG', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.045 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] default_schedule_zone = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.045 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] disk_allocation_ratio = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.045 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] enable_new_services = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.045 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] enabled_apis = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.045 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] enabled_ssl_apis = [] log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.045 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] fatal_deprecations = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.046 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] flat_injected = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.046 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] force_config_drive = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.046 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] force_raw_images = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.046 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.046 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.046 129200 DEBUG oslo_db.sqlalchemy.engines [req-ce0be29e-7def-4a81-91da-5a8b64c8aba2 - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-23 00:40:35.047 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] host = juju-0097f2-0-lxd-7 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.047 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] 
initial_cpu_allocation_ratio = 16.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.047 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] initial_disk_allocation_ratio = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.047 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] initial_ram_allocation_ratio = 1.5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.048 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] injected_network_template = /usr/lib/python3/dist-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.048 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] instance_build_timeout = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.048 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] instance_delete_interval = 300 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.048 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.049 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] instance_name_template = instance-%08x log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.049 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] instance_usage_audit = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.049 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] 
instance_usage_audit_period = month log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.049 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.050 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] instances_path = /var/lib/nova/instances log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.050 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.050 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] key = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.050 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] live_migration_retry_count = 30 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.050 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] log_config_append = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.051 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.050 129201 DEBUG oslo_db.sqlalchemy.engines [req-823825d8-c851-4bfa-b486-95fcef5c691f - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-23 00:40:35.051 
127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] log_dir = /var/log/nova log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.051 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] log_file = /var/log/nova/nova-scheduler.log log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.051 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] log_options = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.051 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] log_rotate_interval = 1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.052 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] log_rotate_interval_type = days log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.052 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] log_rotation_type = none log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.052 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.052 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.052 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d 
%(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.052 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.053 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] logging_user_identity_format = %(user)s %(project)s %(domain)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.053 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] long_rpc_timeout = 1800 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.053 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] max_concurrent_builds = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.053 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.053 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] max_concurrent_snapshots = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.054 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] max_local_block_devices = 3 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.054 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] max_logfile_count = 30 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.054 127783 DEBUG oslo_service.service 
[req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.054 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.054 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] metadata_listen = 0.0.0.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.055 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] metadata_listen_port = 8765 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.055 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] metadata_workers = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.055 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] migrate_max_retries = -1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.055 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] mkisofs_cmd = genisoimage log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.055 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] my_block_storage_ip = 252.129.195.220 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.055 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] my_ip = 252.129.195.220 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.056 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] network_allocate_retries = 0 log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.055 129202 DEBUG oslo_db.sqlalchemy.engines [req-97bfe95c-3e01-452f-b08c-16a815bd3f04 - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-23 00:40:35.056 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.056 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.056 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] osapi_compute_listen_port = 8764 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.056 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] osapi_compute_unique_server_name_scope = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.056 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] osapi_compute_workers = 4 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.057 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] password_length = 12 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.057 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] periodic_enable = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.057 127783 DEBUG oslo_service.service 
[req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] periodic_fuzzy_delay = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.057 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] pointer_model = usbtablet log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.057 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] preallocate_images = none log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.058 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] publish_errors = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.058 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] pybasedir = /usr/lib/python3/dist-packages log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.058 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] ram_allocation_ratio = 0.98 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.058 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] rate_limit_burst = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.058 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.059 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] rate_limit_interval = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.059 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] reboot_timeout = 0 log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.059 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] reclaim_instance_interval = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.059 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] record = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.059 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] report_interval = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.060 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] rescue_timeout = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.060 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] reserved_host_cpus = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.060 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] reserved_host_disk_mb = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.060 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] reserved_host_memory_mb = 512 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.060 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] reserved_huge_pages = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.060 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] resize_confirm_window = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.061 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 
- - - - -] resize_fs_using_block_device = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.061 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.061 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.063 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] run_external_periodic_tasks = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.063 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.063 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.063 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.064 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.064 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] service_down_time = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.064 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] servicegroup_driver 
= db log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.065 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] shelved_offload_time = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.066 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] shelved_poll_interval = 3600 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.066 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] shutdown_timeout = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.066 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] source_is_ipv6 = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.066 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] ssl_only = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.066 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] state_path = /var/lib/nova log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.067 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] sync_power_state_interval = 600 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.067 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] sync_power_state_pool_size = 1000 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.067 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.068 127783 DEBUG 
oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] tempdir = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.068 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] timeout_nbd = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.068 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] transport_url = **** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.069 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] update_resources_interval = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.070 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] use_cow_images = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.070 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] use_eventlog = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.070 129199 DEBUG nova.service [req-f190fc88-ef2f-40c3-973c-af21f6be856a - - - - -] Creating RPC server for service scheduler start /usr/lib/python3/dist-packages/nova/service.py:182 2026-04-23 00:40:35.070 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] use_journal = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.072 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] use_json = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.072 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] use_rootwrap_daemon = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 
00:40:35.072 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] use_stderr = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.072 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] use_syslog = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.073 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vcpu_pin_set = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.073 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vif_plugging_is_fatal = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.073 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vif_plugging_timeout = 300 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.073 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] virt_mkfs = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.073 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] volume_usage_poll_interval = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.074 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] watch_log_file = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.074 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] web = /usr/share/spice-html5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:40:35.075 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_concurrency.disable_process_locking = False 
log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.075 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_concurrency.lock_path = /var/lock/nova log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.076 127783 WARNING oslo_config.cfg [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] Deprecated: Option "auth_strategy" from group "api" is deprecated for removal ( The only non-default choice, ``noauth2``, is for internal development and testing purposes only and should not be used in deployments. This option and its middleware, NoAuthMiddleware[V2_18], will be removed in a future release. ). Its value may be silently ignored in the future. 2026-04-23 00:40:35.076 129200 DEBUG nova.service [req-ce0be29e-7def-4a81-91da-5a8b64c8aba2 - - - - -] Creating RPC server for service scheduler start /usr/lib/python3/dist-packages/nova/service.py:182 2026-04-23 00:40:35.077 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api.auth_strategy = keystone log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.077 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api.compute_link_prefix = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.077 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.078 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api.dhcp_domain = novalocal log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.078 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - 
-] api.enable_instance_password = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.078 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api.glance_link_prefix = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.078 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.079 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.080 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.080 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.080 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api.local_metadata_per_cell = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.081 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api.max_limit = 1000 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.081 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api.metadata_cache_expiration = 15 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.081 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] 
api.neutron_default_tenant_id = default log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.082 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api.use_forwarded_for = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.082 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api.use_neutron_default_nets = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.082 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.083 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.083 129201 DEBUG nova.service [req-823825d8-c851-4bfa-b486-95fcef5c691f - - - - -] Creating RPC server for service scheduler start /usr/lib/python3/dist-packages/nova/service.py:182 2026-04-23 00:40:35.083 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.083 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api.vendordata_dynamic_ssl_certfile = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.084 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.084 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api.vendordata_jsonfile_path = None 
log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.084 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api.vendordata_providers = ['StaticJSON'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.086 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cache.backend = dogpile.cache.null log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.086 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cache.backend_argument = **** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.087 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cache.config_prefix = cache.oslo log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.087 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cache.debug_cache_backend = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.087 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cache.enabled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.088 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cache.expiration_time = 600 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.088 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cache.memcache_dead_retry = 300 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.088 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.088 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.088 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cache.memcache_pool_maxsize = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.089 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.089 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cache.memcache_servers = ['localhost:11211'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.089 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cache.memcache_socket_timeout = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.089 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cache.proxies = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.089 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cache.tls_allowed_ciphers = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.090 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cache.tls_cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.090 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cache.tls_certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 
00:40:35.090 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cache.tls_enabled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.090 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cache.tls_keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.091 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cinder.auth_section = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.091 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cinder.auth_type = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.091 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cinder.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.091 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cinder.catalog_info = volumev3::publicURL log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.091 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cinder.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.092 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cinder.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.092 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cinder.cross_az_attach = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.092 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cinder.debug = False 
log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.092 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cinder.endpoint_template = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.092 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cinder.http_retries = 3 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.092 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cinder.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.093 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cinder.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.093 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cinder.os_region_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.093 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cinder.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.093 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cinder.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.093 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.093 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] compute.cpu_dedicated_set = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.094 
127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] compute.cpu_shared_set = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.094 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.094 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.094 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.094 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.094 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] compute.packing_host_numa_cells_allocation_strategy = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.095 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.095 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.095 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] compute.shutdown_retry_interval = 10 log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.095 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.096 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] conductor.workers = 4 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.096 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] console.allowed_origins = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.096 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] console.ssl_ciphers = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.096 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] console.ssl_minimum_version = default log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.096 129202 DEBUG nova.service [req-97bfe95c-3e01-452f-b08c-16a815bd3f04 - - - - -] Creating RPC server for service scheduler start /usr/lib/python3/dist-packages/nova/service.py:182 2026-04-23 00:40:35.096 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] consoleauth.token_ttl = 600 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.097 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cyborg.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.097 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cyborg.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.097 127783 DEBUG 
oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cyborg.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.097 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cyborg.connect_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.097 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cyborg.connect_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.097 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cyborg.endpoint_override = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.098 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cyborg.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.098 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cyborg.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.098 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cyborg.max_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.098 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cyborg.min_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.098 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cyborg.region_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.098 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cyborg.service_name = None log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.099 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cyborg.service_type = accelerator log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.099 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cyborg.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.099 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cyborg.status_code_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.099 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.099 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cyborg.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.099 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cyborg.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.100 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] cyborg.version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.100 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] database.backend = sqlalchemy log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.100 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] database.connection = **** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.100 127783 
DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] database.connection_debug = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.100 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] database.connection_parameters = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.100 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.101 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] database.connection_trace = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.101 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.101 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] database.db_max_retries = 20 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.101 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.101 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] database.db_retry_interval = 1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.102 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] database.max_overflow = 50 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.102 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - 
- - -] database.max_pool_size = 4 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.102 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] database.max_retries = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.102 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] database.mysql_enable_ndb = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.102 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.103 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] database.pool_timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.103 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] database.retry_interval = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.103 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] database.slave_connection = **** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.103 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] database.sqlite_synchronous = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.103 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] database.use_db_reconnect = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.103 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api_database.backend = sqlalchemy log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.104 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api_database.connection = **** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.104 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api_database.connection_debug = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.104 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api_database.connection_parameters = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.104 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.104 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api_database.connection_trace = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.104 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.104 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api_database.db_max_retries = 20 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.105 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.105 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.105 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api_database.max_overflow = 50 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.105 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api_database.max_pool_size = 4 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.105 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api_database.max_retries = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.105 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api_database.mysql_enable_ndb = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.106 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api_database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.106 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api_database.pool_timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.106 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api_database.retry_interval = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.106 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api_database.slave_connection = **** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.106 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.106 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] devices.enabled_mdev_types = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.107 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.107 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.107 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.107 127783 WARNING oslo_config.cfg [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] Deprecated: Option "api_servers" from group "glance" is deprecated for removal ( Support for image service configuration via standard keystoneauth1 Adapter options was added in the 17.0.0 Queens release. The api_servers option was retained temporarily to allow consumers time to cut over to a real load balancing solution. ). Its value may be silently ignored in the future.
2026-04-23 00:40:35.107 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] glance.api_servers = ['http://252.129.206.87:9292'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.108 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] glance.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.108 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] glance.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.108 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] glance.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.108 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] glance.connect_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.108 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] glance.connect_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.108 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] glance.debug = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.108 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.109 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.109 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] glance.enable_rbd_download = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.109 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] glance.endpoint_override = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.109 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] glance.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.109 129201 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:40:35.109 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] glance.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.109 129201 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:40:35.109 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] glance.max_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.110 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] glance.min_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.110 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] glance.num_retries = 3 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.110 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] glance.rbd_ceph_conf = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.110 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] glance.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.110 129201 DEBUG nova.service [req-823825d8-c851-4bfa-b486-95fcef5c691f - - - - -] Join ServiceGroup membership for this service scheduler start /usr/lib/python3/dist-packages/nova/service.py:199
2026-04-23 00:40:35.110 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] glance.rbd_pool = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.110 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] glance.rbd_user = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.110 129201 DEBUG nova.servicegroup.drivers.db [req-823825d8-c851-4bfa-b486-95fcef5c691f - - - - -] DB_Driver: join new ServiceGroup member juju-0097f2-0-lxd-7 to the scheduler group, service = join /usr/lib/python3/dist-packages/nova/servicegroup/drivers/db.py:44
2026-04-23 00:40:35.111 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] glance.region_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.111 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] glance.service_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.111 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] glance.service_type = image log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.111 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] glance.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.111 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] glance.status_code_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.111 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.111 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] glance.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.112 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] glance.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.112 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.112 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] glance.version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.112 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] guestfs.debug = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.112 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] hyperv.config_drive_cdrom = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.113 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.113 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] hyperv.dynamic_memory_ratio = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.113 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.113 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] hyperv.enable_remotefx = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.113 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] hyperv.instances_path_share = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.113 129201 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:40:35.113 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] hyperv.iscsi_initiator_list = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.113 129201 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:40:35.113 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] hyperv.limit_cpu_features = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.114 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.114 129201 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:40:35.114 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.114 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.114 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.114 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] hyperv.qemu_img_cmd = qemu-img.exe log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.114 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] hyperv.use_multipath_io = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.115 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.115 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.115 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] hyperv.vswitch_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.115 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.115 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] mks.enabled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.116 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] mks.mksproxy_base_url = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.116 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] image_cache.manager_interval = 2400 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.116 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.116 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.117 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.117 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.117 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] image_cache.subdirectory_name = _base log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.117 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] ironic.api_max_retries = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.117 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] ironic.api_retry_interval = 2 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.117 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] ironic.auth_section = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.118 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] ironic.auth_type = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.118 129199 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:40:35.118 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] ironic.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.118 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] ironic.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.118 129199 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:40:35.118 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] ironic.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.118 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] ironic.connect_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.118 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.119 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] ironic.endpoint_override = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.118 129199 DEBUG nova.service [req-f190fc88-ef2f-40c3-973c-af21f6be856a - - - - -] Join ServiceGroup membership for this service scheduler start /usr/lib/python3/dist-packages/nova/service.py:199
2026-04-23 00:40:35.119 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] ironic.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.119 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] ironic.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.119 129199 DEBUG nova.servicegroup.drivers.db [req-f190fc88-ef2f-40c3-973c-af21f6be856a - - - - -] DB_Driver: join new ServiceGroup member juju-0097f2-0-lxd-7 to the scheduler group, service = join /usr/lib/python3/dist-packages/nova/servicegroup/drivers/db.py:44
2026-04-23 00:40:35.119 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] ironic.max_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.119 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] ironic.min_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.119 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] ironic.partition_key = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.119 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] ironic.peer_list = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.120 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] ironic.region_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.120 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.120 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] ironic.service_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.120 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] ironic.service_type = baremetal log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.120 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] ironic.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.120 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] ironic.status_code_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.120 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.121 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] ironic.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.120 129200 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:40:35.121 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] ironic.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.121 129200 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:40:35.121 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] ironic.version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.121 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] key_manager.backend = barbican log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.121 129202 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:40:35.121 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] key_manager.fixed_key = **** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.121 129199 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:40:35.121 129202 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:40:35.121 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] barbican.auth_endpoint = http://localhost/identity/v3 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.122 129199 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:40:35.122 129199 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:40:35.122 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] barbican.barbican_api_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.122 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] barbican.barbican_endpoint = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.122 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] barbican.barbican_endpoint_type = public log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.122 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] barbican.barbican_region_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.122 129202 DEBUG nova.service [req-97bfe95c-3e01-452f-b08c-16a815bd3f04 - - - - -] Join ServiceGroup membership for this service scheduler start /usr/lib/python3/dist-packages/nova/service.py:199
2026-04-23 00:40:35.122 129200 DEBUG nova.service [req-ce0be29e-7def-4a81-91da-5a8b64c8aba2 - - - - -] Join ServiceGroup membership for this service scheduler start /usr/lib/python3/dist-packages/nova/service.py:199
2026-04-23 00:40:35.122 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] barbican.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.122 129202 DEBUG nova.servicegroup.drivers.db [req-97bfe95c-3e01-452f-b08c-16a815bd3f04 - - - - -] DB_Driver: join new ServiceGroup member juju-0097f2-0-lxd-7 to the scheduler group, service = join /usr/lib/python3/dist-packages/nova/servicegroup/drivers/db.py:44
2026-04-23 00:40:35.122 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] barbican.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.122 129200 DEBUG nova.servicegroup.drivers.db [req-ce0be29e-7def-4a81-91da-5a8b64c8aba2 - - - - -] DB_Driver: join new ServiceGroup member juju-0097f2-0-lxd-7 to the scheduler group, service = join /usr/lib/python3/dist-packages/nova/servicegroup/drivers/db.py:44
2026-04-23 00:40:35.123 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] barbican.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.123 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] barbican.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.123 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] barbican.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.123 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] barbican.number_of_retries = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.123 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] barbican.retry_delay = 1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.123 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.124 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] barbican.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.124 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] barbican.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.124 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] barbican.verify_ssl = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.124 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] barbican.verify_ssl_path = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.124 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.124 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.125 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] barbican_service_user.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.124 129200 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:40:35.125 129202 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:40:35.125 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.125 129202 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:40:35.125 129200 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:40:35.125 129202 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:40:35.125 129200 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:40:35.125 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.125 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.125 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] barbican_service_user.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.125 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.126 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] barbican_service_user.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.126 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vault.approle_role_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.126 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vault.approle_secret_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.126 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vault.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.126 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vault.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.126 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vault.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.126 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vault.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.127 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vault.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.127 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vault.kv_mountpoint = secret log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.127 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vault.kv_version = 2 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.127 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vault.namespace = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.127 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vault.root_token_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.127 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vault.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.128 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vault.ssl_ca_crt_file = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.128 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vault.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.128 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vault.use_ssl = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.128 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vault.vault_url = http://127.0.0.1:8200 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.128 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] keystone.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.128 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] keystone.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.129 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] keystone.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.129 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] keystone.connect_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.129 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] keystone.connect_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.129 127783 DEBUG
oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] keystone.endpoint_override = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.129 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] keystone.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.129 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] keystone.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.129 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] keystone.max_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.130 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] keystone.min_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.130 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] keystone.region_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.130 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] keystone.service_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.130 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] keystone.service_type = identity log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.130 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] keystone.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.130 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] keystone.status_code_retries = None 
log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.131 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.131 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] keystone.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.131 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] keystone.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.131 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] keystone.version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.131 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.connection_uri = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.131 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.cpu_mode = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.132 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.cpu_model_extra_flags = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.132 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.cpu_models = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.132 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 
00:40:35.132 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.device_detach_timeout = 20 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.132 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.disk_cachemodes = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.132 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.disk_prefix = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.132 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.enabled_perf_events = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.133 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.file_backed_memory = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.133 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.gid_maps = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.133 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.hw_disk_discard = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.133 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.hw_machine_type = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.133 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.images_rbd_ceph_conf = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.133 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] 
libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.134 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.134 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.images_rbd_glance_store_name = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.134 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.images_rbd_pool = rbd log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.134 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.images_type = default log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.134 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.images_volume_group = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.134 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.inject_key = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.135 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.inject_partition = -2 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.135 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.inject_password = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.135 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.iscsi_iface = None log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.135 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.iser_use_multipath = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.135 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.135 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.136 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.136 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.136 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.136 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.136 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.live_migration_permit_auto_converge = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.136 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.live_migration_permit_post_copy = 
False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.136 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.live_migration_scheme = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.137 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.live_migration_timeout_action = abort log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.137 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.137 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.live_migration_uri = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.137 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.137 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.max_queues = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.137 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.138 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.nfs_mount_options = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.138 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.nfs_mount_point_base = /var/lib/nova/mnt 
log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.138 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.138 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.num_iser_scan_tries = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.138 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.139 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.139 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.num_pcie_ports = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.139 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.num_volume_scan_tries = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.139 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.pmem_namespaces = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.139 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.quobyte_client_cfg = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.139 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.140 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.140 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.140 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.140 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.rbd_secret_uuid = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.140 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.rbd_user = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.140 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.141 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.141 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.rescue_image_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.141 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.rescue_kernel_id = None log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.141 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.rescue_ramdisk_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.141 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.rng_dev_path = /dev/urandom log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.141 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.rx_queue_size = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.141 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.smbfs_mount_options = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.142 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.142 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.snapshot_compression = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.142 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.snapshot_image_format = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.142 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.snapshots_directory = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.144 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.sparse_logical_volumes = False log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.144 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.swtpm_enabled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.144 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.swtpm_group = tss log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.144 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.swtpm_user = tss log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.144 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.sysinfo_serial = unique log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.145 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.tx_queue_size = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.145 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.uid_maps = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.145 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.145 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.virt_type = kvm log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.145 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.volume_clear = zero log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.145 127783 DEBUG oslo_service.service 
[req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.volume_clear_size = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.146 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.volume_use_multipath = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.146 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.vzstorage_cache_path = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.147 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.147 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.vzstorage_mount_group = qemu log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.147 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.vzstorage_mount_opts = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.147 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.vzstorage_mount_perms = 0770 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.147 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.148 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.vzstorage_mount_user = stack log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.148 127783 DEBUG 
oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.148 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] neutron.auth_section = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.148 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] neutron.auth_type = password log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.148 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] neutron.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.148 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] neutron.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.149 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] neutron.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.150 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] neutron.connect_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.150 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] neutron.connect_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.150 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] neutron.default_floating_pool = nova log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.150 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] 
neutron.endpoint_override = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.150 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.151 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] neutron.http_retries = 3 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.151 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] neutron.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.152 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] neutron.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.152 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] neutron.max_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.152 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.152 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] neutron.min_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.153 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] neutron.ovs_bridge = br-int log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.153 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] neutron.physnets = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 
2026-04-23 00:40:35.153 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] neutron.region_name = RegionOne log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.153 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.153 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] neutron.service_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.154 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] neutron.service_type = network log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.154 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] neutron.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.155 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] neutron.status_code_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.155 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.155 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] neutron.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.155 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] neutron.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.155 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] neutron.version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.155 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.156 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] notifications.default_level = INFO log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.156 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.156 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.156 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.156 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] pci.alias = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.156 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] pci.passthrough_whitelist = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.157 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.auth_section = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.157 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.auth_type = password log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.158 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.auth_url = http://252.129.220.164:35357 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.158 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.158 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.158 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.158 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.connect_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.158 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.connect_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.159 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.default_domain_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.159 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.default_domain_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.159 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.domain_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.159 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.domain_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.159 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.endpoint_override = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.159 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.160 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.160 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.max_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.160 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.min_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.160 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.password = **** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.160 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.project_domain_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.160 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.project_domain_name = service_domain log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.161 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.project_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.161 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.project_name = services log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.162 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.region_name = RegionOne log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.162 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.service_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.162 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.service_type = placement log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.162 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.162 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.status_code_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.163 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.163 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.system_scope = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.163 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.163 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.trust_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.163 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.user_domain_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.163 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.user_domain_name = service_domain log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.163 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.user_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.164 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.username = nova log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.165 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.165 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] placement.version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.165 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] powervm.disk_driver = localdisk log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.165 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] powervm.proc_units_factor = 0.1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.165 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] powervm.volume_group_name = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.166 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] quota.cores = 20 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.166 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.166 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] quota.driver = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.167 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.167 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.167 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] quota.injected_files = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.168 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] quota.instances = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.168 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] quota.key_pairs = 100 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.168 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] quota.metadata_items = 128 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.169 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] quota.ram = 51200 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.169 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] quota.recheck_quota = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.169 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] quota.server_group_members = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.170 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] quota.server_groups = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.170 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] rdp.enabled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.170 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.170 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] scheduler.discover_hosts_in_cells_interval = 30 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.171 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.171 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.171 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.171 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] scheduler.max_attempts = 3 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.171 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.171 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.172 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.173 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.173 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.173 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] scheduler.workers = 4 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.173 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.173 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.174 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.174 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] filter_scheduler.build_failure_weight_multiplier = 0.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.174 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.174 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.174 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.175 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] filter_scheduler.enabled_filters = ['AvailabilityZoneFilter', 'ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'DifferentHostFilter', 'SameHostFilter'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.175 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.175 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.175 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.175 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.176 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.176 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.176 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.176 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.176 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.176 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.176 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.177 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.177 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.177 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.177 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.177 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] metrics.required = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.177 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] metrics.weight_multiplier = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.178 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] metrics.weight_of_unavailable = -10000.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.178 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] metrics.weight_setting = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.178 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] serial_console.base_url = wss://252.129.195.220:6083/ log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.178 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] serial_console.enabled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.178 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] serial_console.port_range = 10000:20000 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.179 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.179 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.179 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.179 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] service_user.auth_section = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.179 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] service_user.auth_type = password log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.179 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] service_user.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.180 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] service_user.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.180 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] service_user.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.180 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] service_user.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.180 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] service_user.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.180 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.180 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] service_user.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.180 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] service_user.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.181 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] spice.agent_enabled = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.181 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] spice.enabled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.181 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.181 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] spice.html5proxy_host = 0.0.0.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.181 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] spice.html5proxy_port = 6082 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.182 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] spice.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.182 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.182 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] upgrade_levels.baseapi = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.182 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] upgrade_levels.cert = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.182 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] upgrade_levels.compute = auto log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.182 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] upgrade_levels.conductor = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.183 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] upgrade_levels.scheduler = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.183 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.183 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.183 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.183 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.183 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.184 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.184 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.184 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.184 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.184 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.184 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vmware.ca_file = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.185 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vmware.cache_prefix = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.185 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vmware.cluster_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.185 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vmware.connection_pool_size = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.185 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vmware.console_delay_seconds = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.185 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vmware.datastore_regex = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.186 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vmware.host_ip = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.186 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vmware.host_password = **** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.187 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vmware.host_port = 443 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.187 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vmware.host_username = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.187 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vmware.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.187 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vmware.integration_bridge = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.187 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vmware.maximum_objects = 100 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.187 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vmware.pbm_default_policy = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.188 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vmware.pbm_enabled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.188 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vmware.pbm_wsdl_location = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.188 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vmware.serial_log_dir = /opt/vmware/vspc log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.188 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vmware.serial_port_proxy_uri = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.188 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.188 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.188 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vmware.use_linked_clone = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.189 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vmware.vnc_keymap = en-us log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.189 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vmware.vnc_port = 5900 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.189 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vmware.vnc_port_total = 10000 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.189 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vnc.auth_schemes = ['none'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.189 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vnc.enabled = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.190 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.190 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vnc.novncproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.190 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vnc.novncproxy_port = 6080 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.190 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vnc.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.190 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vnc.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.190 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vnc.vencrypt_ca_certs = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.191 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vnc.vencrypt_client_cert = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.191 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] vnc.vencrypt_client_key = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.191 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.191 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.191 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.191 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.192 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.192 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] workarounds.disable_rootwrap = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.192 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.192 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] workarounds.enable_qemu_monitor_announce_self = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:40:35.192 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08
- - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.192 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.192 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.193 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.193 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] workarounds.reserve_disk_resource_for_image_cache = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.193 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.193 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] workarounds.skip_cpu_compare_on_dest = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.193 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.193 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 
2026-04-23 00:40:35.194 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.194 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] wsgi.api_paste_config = /etc/nova/api-paste.ini log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.194 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] wsgi.client_socket_timeout = 900 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.194 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] wsgi.default_pool_size = 1000 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.194 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] wsgi.keep_alive = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.194 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] wsgi.max_header_line = 16384 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.195 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] wsgi.secure_proxy_ssl_header = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.195 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] wsgi.ssl_ca_file = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.195 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] wsgi.ssl_cert_file = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.195 127783 DEBUG oslo_service.service 
[req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] wsgi.ssl_key_file = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.195 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] wsgi.tcp_keepidle = 600 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.195 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.196 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] zvm.ca_file = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.196 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] zvm.cloud_connector_url = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.196 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] zvm.image_tmp_path = /var/lib/nova/images log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.196 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] zvm.reachable_timeout = 300 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.196 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.196 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.197 127783 DEBUG 
oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.197 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.197 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.197 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.197 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_policy.enforce_scope = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.198 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.198 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.198 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.198 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.198 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.198 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.198 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.199 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.199 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.199 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.199 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] profiler.connection_string = messaging:// log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.199 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] profiler.enabled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.200 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] 
profiler.es_doc_type = notification log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.200 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] profiler.es_scroll_size = 10000 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.200 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] profiler.es_scroll_time = 2m log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.200 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] profiler.filter_error_trace = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.200 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] profiler.hmac_keys = SECRET_KEY log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.200 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.201 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] profiler.socket_timeout = 0.1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.201 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] profiler.trace_sqlalchemy = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.201 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] remote_debug.host = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.201 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] remote_debug.port = None log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.201 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.201 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.202 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.202 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.202 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.202 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.202 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.202 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.203 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] 
oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.203 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.203 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.203 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.203 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.203 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.204 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.204 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.204 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 
2026-04-23 00:40:35.204 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.204 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.204 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.205 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.205 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.205 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.205 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.205 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.205 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_rabbit.ssl_key_file = log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.205 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.206 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.206 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.206 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.206 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.206 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_limit.auth_section = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.207 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_limit.auth_type = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.207 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_limit.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.207 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_limit.certfile = None log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.207 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_limit.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.207 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_limit.connect_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.207 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.208 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_limit.endpoint_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.208 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_limit.endpoint_override = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.208 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_limit.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.208 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_limit.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.208 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_limit.max_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.208 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_limit.min_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.208 
127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_limit.region_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.209 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_limit.service_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.209 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_limit.service_type = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.209 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_limit.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.209 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.209 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.210 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_limit.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.210 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_limit.valid_interfaces = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.210 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_limit.version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.210 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - 
- - - -] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.210 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.210 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] oslo_reports.log_dir = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:40:35.210 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2613 2026-04-23 00:40:36.119 129201 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:40:36.119 129201 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:40:36.119 129201 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:40:36.122 129199 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:40:36.122 129199 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:40:36.123 129199 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:40:36.126 129202 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener 
connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:40:36.126 129202 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:40:36.126 129202 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:40:36.126 129200 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:40:36.127 129200 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:40:36.127 129200 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:40:38.121 129201 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:40:38.122 129201 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:40:38.122 129201 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:40:38.125 129199 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:40:38.125 129199 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:40:38.125 129199 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:40:38.128 129202 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:40:38.128 129202 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:40:38.129 129202 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:40:38.129 129200 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:40:38.129 129200 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:40:38.129 129200 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:40:42.125 129201 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:40:42.125 129201 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:40:42.125 129201 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:40:42.128 129199 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:40:42.128 129199 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:40:42.128 129199 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:40:42.131 129200 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:40:42.132 129200 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:40:42.132 129200 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:40:42.132 129202 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:40:42.132 129202 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:40:42.132 129202 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:40:50.126 129201 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:40:50.126 129201 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:40:50.126 129201 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:40:50.129 129199 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:40:50.129 129199 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:40:50.129 129199 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:40:50.132 129200 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:40:50.133 129200 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:40:50.133 129200 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:40:50.134 129202 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:40:50.134 129202 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:40:50.134 129202 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:40:57.131 129200 DEBUG oslo_service.periodic_task [req-93ba5916-c418-43cd-a86d-4ee8b750e86d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:40:57.132 129200 DEBUG oslo_db.sqlalchemy.engines [req-93ba5916-c418-43cd-a86d-4ee8b750e86d - - - - -] Parent process 127783 forked (129200) with an open database connection, which is being discarded and recreated. checkout /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:413
2026-04-23 00:40:57.146 129200 DEBUG oslo_concurrency.lockutils [req-93ba5916-c418-43cd-a86d-4ee8b750e86d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:40:57.147 129200 DEBUG oslo_concurrency.lockutils [req-93ba5916-c418-43cd-a86d-4ee8b750e86d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:40:57.152 129200 DEBUG oslo_db.sqlalchemy.engines [req-93ba5916-c418-43cd-a86d-4ee8b750e86d - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-23 00:41:14.127 129202 DEBUG oslo_service.periodic_task [req-88c2bb31-f05a-49c5-bc7b-ae53efb7688b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:41:14.128 129202 DEBUG oslo_db.sqlalchemy.engines [req-88c2bb31-f05a-49c5-bc7b-ae53efb7688b - - - - -] Parent process 127783 forked (129202) with an open database connection, which is being discarded and recreated. checkout /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:413
2026-04-23 00:41:14.135 129202 DEBUG oslo_concurrency.lockutils [req-88c2bb31-f05a-49c5-bc7b-ae53efb7688b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:41:14.137 129202 DEBUG oslo_concurrency.lockutils [req-88c2bb31-f05a-49c5-bc7b-ae53efb7688b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:41:14.142 129202 DEBUG oslo_db.sqlalchemy.engines [req-88c2bb31-f05a-49c5-bc7b-ae53efb7688b - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-23 00:41:20.100 129199 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:41:20.100 129199 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:41:20.100 129199 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:41:20.102 129200 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:41:20.103 129200 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:41:20.103 129200 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:41:20.108 129201 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:41:20.108 129201 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:41:20.108 129201 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:41:20.116 129202 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:41:20.117 129202 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:41:20.117 129202 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:41:22.113 129201 DEBUG oslo_service.periodic_task [req-79ed35b1-5a94-4f7c-9abd-5e33e8260f29 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:41:22.114 129201 DEBUG oslo_db.sqlalchemy.engines [req-79ed35b1-5a94-4f7c-9abd-5e33e8260f29 - - - - -] Parent process 127783 forked (129201) with an open database connection, which is being discarded and recreated. checkout /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:413
2026-04-23 00:41:22.120 129201 DEBUG oslo_concurrency.lockutils [req-79ed35b1-5a94-4f7c-9abd-5e33e8260f29 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:41:22.122 129201 DEBUG oslo_concurrency.lockutils [req-79ed35b1-5a94-4f7c-9abd-5e33e8260f29 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:41:22.127 129201 DEBUG oslo_db.sqlalchemy.engines [req-79ed35b1-5a94-4f7c-9abd-5e33e8260f29 - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-23 00:41:27.176 129200 DEBUG oslo_service.periodic_task [req-93ba5916-c418-43cd-a86d-4ee8b750e86d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:41:27.180 129200 DEBUG oslo_concurrency.lockutils [req-717d2ed7-c90e-49d4-9f93-996507ff58e8 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:41:27.180 129200 DEBUG oslo_concurrency.lockutils [req-717d2ed7-c90e-49d4-9f93-996507ff58e8 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:41:31.121 129199 DEBUG oslo_service.periodic_task [req-3acd54b5-e04b-4337-80a4-856359d9719d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:41:31.122 129199 DEBUG oslo_db.sqlalchemy.engines [req-3acd54b5-e04b-4337-80a4-856359d9719d - - - - -] Parent process 127783 forked (129199) with an open database connection, which is being discarded and recreated. checkout /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:413
2026-04-23 00:41:31.129 129199 DEBUG oslo_concurrency.lockutils [req-3acd54b5-e04b-4337-80a4-856359d9719d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:41:31.131 129199 DEBUG oslo_concurrency.lockutils [req-3acd54b5-e04b-4337-80a4-856359d9719d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:41:31.137 129199 DEBUG oslo_db.sqlalchemy.engines [req-3acd54b5-e04b-4337-80a4-856359d9719d - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-23 00:41:43.775 127783 INFO oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] Caught SIGTERM, stopping children
2026-04-23 00:41:43.775 127783 DEBUG oslo_concurrency.lockutils [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:294
2026-04-23 00:41:43.775 127783 DEBUG oslo_concurrency.lockutils [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:312
2026-04-23 00:41:43.776 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] Stop services. stop /usr/lib/python3/dist-packages/oslo_service/service.py:695
2026-04-23 00:41:43.776 127783 DEBUG oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] Killing children. stop /usr/lib/python3/dist-packages/oslo_service/service.py:700
2026-04-23 00:41:43.778 127783 INFO oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] Waiting on 4 children to exit
2026-04-23 00:41:43.782 127783 INFO oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] Child 129199 killed by signal 15
2026-04-23 00:41:43.783 127783 INFO oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] Child 129201 killed by signal 15
2026-04-23 00:41:43.784 127783 INFO oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] Child 129202 killed by signal 15
2026-04-23 00:41:43.786 127783 INFO oslo_service.service [req-e3ab25a9-8cdf-4bce-ab8e-ec82e9c56c08 - - - - -] Child 129200 killed by signal 15
2026-04-23 00:41:45.843 133300 DEBUG oslo_db.sqlalchemy.engines [req-935a501c-f8d7-48c8-acb0-ff0ec3c51700 - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-23 00:41:45.877 133300 DEBUG nova.scheduler.host_manager [req-935a501c-f8d7-48c8-acb0-ff0ec3c51700 - - - - -] Found 1 cells: de8a22a1-b255-45ae-baf4-856041bc1c3f refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730
2026-04-23 00:41:45.877 133300 DEBUG nova.scheduler.host_manager [req-935a501c-f8d7-48c8-acb0-ff0ec3c51700 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742
2026-04-23 00:41:45.883 133300 WARNING nova.scheduler.filters.availability_zone_filter [req-935a501c-f8d7-48c8-acb0-ff0ec3c51700 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement.
2026-04-23 00:41:45.962 133300 DEBUG nova.scheduler.host_manager [req-935a501c-f8d7-48c8-acb0-ff0ec3c51700 - - - - -] START:_async_init_instance_info _async_init_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:411
2026-04-23 00:41:45.964 133300 DEBUG oslo_concurrency.lockutils [req-c34b9b1c-d29d-49ad-bf43-d08b3426c46f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:41:45.965 133300 DEBUG oslo_concurrency.lockutils [req-c34b9b1c-d29d-49ad-bf43-d08b3426c46f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:41:45.970 133300 DEBUG oslo_db.sqlalchemy.engines [req-c34b9b1c-d29d-49ad-bf43-d08b3426c46f - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-23 00:41:45.981 133300 DEBUG nova.scheduler.host_manager [req-c34b9b1c-d29d-49ad-bf43-d08b3426c46f - - - - -] Total number of compute nodes: 0 _async_init_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:424
2026-04-23 00:41:45.982 133300 DEBUG oslo_concurrency.lockutils [req-c34b9b1c-d29d-49ad-bf43-d08b3426c46f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:41:45.983 133300 DEBUG oslo_concurrency.lockutils [req-c34b9b1c-d29d-49ad-bf43-d08b3426c46f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:41:46.048 133300 DEBUG nova.scheduler.host_manager [req-c34b9b1c-d29d-49ad-bf43-d08b3426c46f - - - - -] Adding 0 instances for hosts 10-20 _async_init_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:443
2026-04-23 00:41:46.049 133300 DEBUG nova.scheduler.host_manager [req-c34b9b1c-d29d-49ad-bf43-d08b3426c46f - - - - -] END:_async_init_instance_info _async_init_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:454
2026-04-23 00:41:47.865 133300 DEBUG nova.context [req-935a501c-f8d7-48c8-acb0-ff0ec3c51700 - - - - -] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),de8a22a1-b255-45ae-baf4-856041bc1c3f(cell1) load_cells /usr/lib/python3/dist-packages/nova/context.py:464
2026-04-23 00:41:47.866 133300 DEBUG oslo_concurrency.lockutils [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:41:47.867 133300 DEBUG oslo_concurrency.lockutils [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:41:47.867 133300 DEBUG oslo_concurrency.lockutils [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:41:47.867 133300 DEBUG oslo_concurrency.lockutils [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:41:47.876 133300 DEBUG oslo_db.sqlalchemy.engines [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-23 00:41:47.885 133300 DEBUG oslo_concurrency.lockutils [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:294
2026-04-23 00:41:47.885 133300 DEBUG oslo_concurrency.lockutils [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:312
2026-04-23 00:41:47.885 133300 INFO oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] Starting 4 workers
2026-04-23 00:41:47.889 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] Started child 134138 _start_child /usr/lib/python3/dist-packages/oslo_service/service.py:575
2026-04-23 00:41:47.893 134138 INFO nova.service [-] Starting scheduler node (version 25.2.1)
2026-04-23 00:41:47.894 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] Started child 134140 _start_child /usr/lib/python3/dist-packages/oslo_service/service.py:575
2026-04-23 00:41:47.899 134140 INFO nova.service [-] Starting scheduler node (version 25.2.1)
2026-04-23 00:41:47.905 134138 DEBUG oslo_db.sqlalchemy.engines [req-f4acda3d-1e5f-4f0d-9e2d-36fc0c302a95 - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-23 00:41:47.938 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] Started child 134145 _start_child /usr/lib/python3/dist-packages/oslo_service/service.py:575
2026-04-23 00:41:47.942 134145 INFO nova.service [-] Starting scheduler node (version 25.2.1)
2026-04-23 00:41:47.943 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] Started child 134146 _start_child /usr/lib/python3/dist-packages/oslo_service/service.py:575
2026-04-23 00:41:47.945 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] Full set of CONF: wait /usr/lib/python3/dist-packages/oslo_service/service.py:649
2026-04-23 00:41:47.945 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2589
2026-04-23 00:41:47.945 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2590
2026-04-23 00:41:47.945 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] command line args: ['--config-file=/etc/nova/nova.conf', '--log-file=/var/log/nova/nova-scheduler.log'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2591
2026-04-23 00:41:47.946 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] config files: ['/etc/nova/nova.conf'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2592
2026-04-23 00:41:47.946 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] ================================================================================ log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2594
2026-04-23 00:41:47.946 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] allow_resize_to_same_host = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.946 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] arq_binding_timeout = 300 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.947 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] block_device_allocate_retries = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.947 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.947 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cert = self.pem log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.947 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] compute_driver = libvirt.LibvirtDriver log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.947 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] compute_monitors = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.947 134146 INFO nova.service [-] Starting scheduler node (version 25.2.1)
2026-04-23 00:41:47.948 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] config_dir = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.948 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] config_drive_format = iso9660 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.948 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] config_file = ['/etc/nova/nova.conf'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.948 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] config_source = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.949 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] console_host = juju-0097f2-0-lxd-7 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.949 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] control_exchange = nova log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.949 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cpu_allocation_ratio = 2.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.949 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] daemon = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.949 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] debug = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.950 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.950 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] default_availability_zone = nova log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.950 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] default_ephemeral_format = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.950 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=DEBUG', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.950 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] default_schedule_zone = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.951 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] disk_allocation_ratio = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.951 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] enable_new_services = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.951 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] enabled_apis = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.951 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] enabled_ssl_apis = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.951 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] fatal_deprecations = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.951 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] flat_injected = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.952 134138 DEBUG nova.service [req-f4acda3d-1e5f-4f0d-9e2d-36fc0c302a95 - - - - -] Creating RPC server for service scheduler start /usr/lib/python3/dist-packages/nova/service.py:182
2026-04-23 00:41:47.953 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] force_config_drive = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.954 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] force_raw_images = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.954 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.955 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.956 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] host = juju-0097f2-0-lxd-7 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.957 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] initial_cpu_allocation_ratio = 16.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.958 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] initial_disk_allocation_ratio = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.958 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] initial_ram_allocation_ratio = 1.5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.958 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] injected_network_template = /usr/lib/python3/dist-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.959 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] instance_build_timeout = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.959 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] instance_delete_interval = 300 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.959 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.959 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] instance_name_template = instance-%08x log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.962 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] instance_usage_audit = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.963 134146 DEBUG oslo_db.sqlalchemy.engines [req-650dfd50-bd20-4d9b-8bba-cb4803376ad1 - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-23 00:41:47.967 134145 DEBUG oslo_db.sqlalchemy.engines [req-5726421a-ba7b-4eb5-966f-ea797f430a12 - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-23 00:41:47.971 134140 DEBUG oslo_db.sqlalchemy.engines [req-c95814ce-f470-4603-85f9-5aaaaf1d46db - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-23 00:41:47.962 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] instance_usage_audit_period = month log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.978 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.978 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] instances_path = /var/lib/nova/instances log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.978 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.978 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] key = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.984 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] live_migration_retry_count = 30 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.985 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] log_config_append = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.985 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.985 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] log_dir = /var/log/nova log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23 00:41:47.986 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] log_file = /var/log/nova/nova-scheduler.log log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-23
00:41:47.986 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] log_options = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:47.986 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] log_rotate_interval = 1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:47.986 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] log_rotate_interval_type = days log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:47.986 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] log_rotation_type = none log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:47.986 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:47.987 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:47.987 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:47.987 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:47.987 133300 
DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] logging_user_identity_format = %(user)s %(project)s %(domain)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:47.987 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] long_rpc_timeout = 1800 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:47.987 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] max_concurrent_builds = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:47.988 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:47.988 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] max_concurrent_snapshots = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:47.998 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] max_local_block_devices = 3 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:47.999 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] max_logfile_count = 30 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:47.999 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:47.999 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.002 133300 DEBUG oslo_service.service 
[req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] metadata_listen = 0.0.0.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.002 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] metadata_listen_port = 8765 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.003 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] metadata_workers = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.003 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] migrate_max_retries = -1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.006 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:41:48.007 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:41:48.003 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] mkisofs_cmd = genisoimage log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.009 134140 DEBUG nova.service [req-c95814ce-f470-4603-85f9-5aaaaf1d46db - - - - -] Creating RPC server for service scheduler start /usr/lib/python3/dist-packages/nova/service.py:182 2026-04-23 00:41:48.009 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] my_block_storage_ip = 252.129.195.220 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.009 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] my_ip = 252.129.195.220 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.009 134138 DEBUG 
nova.service [req-f4acda3d-1e5f-4f0d-9e2d-36fc0c302a95 - - - - -] Join ServiceGroup membership for this service scheduler start /usr/lib/python3/dist-packages/nova/service.py:199 2026-04-23 00:41:48.009 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] network_allocate_retries = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.009 134138 DEBUG nova.servicegroup.drivers.db [req-f4acda3d-1e5f-4f0d-9e2d-36fc0c302a95 - - - - -] DB_Driver: join new ServiceGroup member juju-0097f2-0-lxd-7 to the scheduler group, service = join /usr/lib/python3/dist-packages/nova/servicegroup/drivers/db.py:44 2026-04-23 00:41:48.010 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:41:48.010 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:41:48.010 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:41:48.010 134145 DEBUG nova.service [req-5726421a-ba7b-4eb5-966f-ea797f430a12 - - - - -] Creating RPC server for service scheduler start /usr/lib/python3/dist-packages/nova/service.py:182 2026-04-23 00:41:48.009 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.012 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.013 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] 
osapi_compute_listen_port = 8764 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.013 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] osapi_compute_unique_server_name_scope = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.013 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] osapi_compute_workers = 4 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.013 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] password_length = 12 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.013 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] periodic_enable = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.016 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] periodic_fuzzy_delay = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.016 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] pointer_model = usbtablet log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.016 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] preallocate_images = none log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.017 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] publish_errors = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.017 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] pybasedir = /usr/lib/python3/dist-packages log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 
00:41:48.017 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] ram_allocation_ratio = 0.98 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.017 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] rate_limit_burst = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.017 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.020 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] rate_limit_interval = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.021 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] reboot_timeout = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.021 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] reclaim_instance_interval = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.021 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] record = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.021 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] report_interval = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.021 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] rescue_timeout = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.021 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] reserved_host_cpus = 0 log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.022 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] reserved_host_disk_mb = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.022 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] reserved_host_memory_mb = 512 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.022 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] reserved_huge_pages = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.022 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] resize_confirm_window = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.022 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] resize_fs_using_block_device = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.022 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.023 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.023 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] run_external_periodic_tasks = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.023 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 
00:41:48.023 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.023 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.023 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.024 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] service_down_time = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.024 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] servicegroup_driver = db log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.024 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] shelved_offload_time = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.024 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] shelved_poll_interval = 3600 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.024 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] shutdown_timeout = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.024 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] source_is_ipv6 = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.024 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] 
ssl_only = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.025 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] state_path = /var/lib/nova log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.025 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] sync_power_state_interval = 600 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.025 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] sync_power_state_pool_size = 1000 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.025 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.025 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] tempdir = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.025 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] timeout_nbd = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.026 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] transport_url = **** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.026 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] update_resources_interval = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.026 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] use_cow_images = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.026 133300 DEBUG 
oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] use_eventlog = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.026 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] use_journal = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.026 134146 DEBUG nova.service [req-650dfd50-bd20-4d9b-8bba-cb4803376ad1 - - - - -] Creating RPC server for service scheduler start /usr/lib/python3/dist-packages/nova/service.py:182 2026-04-23 00:41:48.026 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] use_json = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.027 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] use_rootwrap_daemon = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.027 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] use_stderr = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.027 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] use_syslog = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.027 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vcpu_pin_set = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.027 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vif_plugging_is_fatal = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.027 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vif_plugging_timeout = 300 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 
2026-04-23 00:41:48.028 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] virt_mkfs = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.028 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] volume_usage_poll_interval = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.028 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] watch_log_file = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.028 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] web = /usr/share/spice-html5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-23 00:41:48.028 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.029 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_concurrency.lock_path = /var/lock/nova log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.029 133300 WARNING oslo_config.cfg [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] Deprecated: Option "auth_strategy" from group "api" is deprecated for removal ( The only non-default choice, ``noauth2``, is for internal development and testing purposes only and should not be used in deployments. This option and its middleware, NoAuthMiddleware[V2_18], will be removed in a future release. ). Its value may be silently ignored in the future. 
2026-04-23 00:41:48.029 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api.auth_strategy = keystone log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.029 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api.compute_link_prefix = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.030 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.030 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api.dhcp_domain = novalocal log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.030 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api.enable_instance_password = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.030 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api.glance_link_prefix = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.030 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.030 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.031 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api.instance_list_per_project_cells = False log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.031 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.031 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api.local_metadata_per_cell = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.031 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api.max_limit = 1000 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.031 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api.metadata_cache_expiration = 15 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.031 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api.neutron_default_tenant_id = default log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.032 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api.use_forwarded_for = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.032 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api.use_neutron_default_nets = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.032 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.032 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.032 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.032 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api.vendordata_dynamic_ssl_certfile = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.033 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.033 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api.vendordata_jsonfile_path = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.033 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api.vendordata_providers = ['StaticJSON'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.033 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cache.backend = dogpile.cache.null log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.033 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cache.backend_argument = **** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.033 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cache.config_prefix = cache.oslo log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.034 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cache.debug_cache_backend = False log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.034 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cache.enabled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.034 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cache.expiration_time = 600 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.034 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cache.memcache_dead_retry = 300 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.035 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.035 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.035 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cache.memcache_pool_maxsize = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.035 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.035 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cache.memcache_servers = ['localhost:11211'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.035 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cache.memcache_socket_timeout = 1.0 log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.035 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cache.proxies = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.036 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cache.tls_allowed_ciphers = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.036 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cache.tls_cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.036 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cache.tls_certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.036 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cache.tls_enabled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.036 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cache.tls_keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.037 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cinder.auth_section = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.037 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cinder.auth_type = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.037 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cinder.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.037 133300 DEBUG oslo_service.service 
[req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cinder.catalog_info = volumev3::publicURL log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.037 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cinder.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.037 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cinder.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.038 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cinder.cross_az_attach = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.038 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cinder.debug = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.038 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cinder.endpoint_template = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.038 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cinder.http_retries = 3 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.038 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cinder.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.038 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cinder.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.039 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 
2026-04-23 00:41:48.040 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:41:48.040 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cinder.os_region_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.040 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cinder.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.040 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cinder.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.041 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.042 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] compute.cpu_dedicated_set = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.042 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:41:48.042 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] compute.cpu_shared_set = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.042 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:41:48.042 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] compute.image_type_exclude_list = [] log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.042 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.043 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.043 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.043 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] compute.packing_host_numa_cells_allocation_strategy = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.043 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.044 134140 DEBUG nova.service [req-c95814ce-f470-4603-85f9-5aaaaf1d46db - - - - -] Join ServiceGroup membership for this service scheduler start /usr/lib/python3/dist-packages/nova/service.py:199 2026-04-23 00:41:48.044 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.044 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.044 134140 DEBUG nova.servicegroup.drivers.db [req-c95814ce-f470-4603-85f9-5aaaaf1d46db - - - - -] 
DB_Driver: join new ServiceGroup member juju-0097f2-0-lxd-7 to the scheduler group, service = join /usr/lib/python3/dist-packages/nova/servicegroup/drivers/db.py:44 2026-04-23 00:41:48.045 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.045 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] conductor.workers = 4 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.045 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:41:48.045 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] console.allowed_origins = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.045 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:41:48.045 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:41:48.045 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] console.ssl_ciphers = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.040 134145 DEBUG nova.service [req-5726421a-ba7b-4eb5-966f-ea797f430a12 - - - - -] Join ServiceGroup membership for this service scheduler start /usr/lib/python3/dist-packages/nova/service.py:199 2026-04-23 00:41:48.045 134145 DEBUG nova.servicegroup.drivers.db [req-5726421a-ba7b-4eb5-966f-ea797f430a12 - - - - -] DB_Driver: join new ServiceGroup member juju-0097f2-0-lxd-7 to the scheduler 
group, service = join /usr/lib/python3/dist-packages/nova/servicegroup/drivers/db.py:44 2026-04-23 00:41:48.045 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] console.ssl_minimum_version = default log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.048 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:41:48.050 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:41:48.050 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:41:48.050 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] consoleauth.token_ttl = 600 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.051 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cyborg.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.051 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cyborg.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.051 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cyborg.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.051 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cyborg.connect_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.051 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - 
-] cyborg.connect_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.052 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cyborg.endpoint_override = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.052 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cyborg.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.052 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cyborg.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.052 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cyborg.max_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.052 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cyborg.min_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.052 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cyborg.region_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.052 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cyborg.service_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.053 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cyborg.service_type = accelerator log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.053 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cyborg.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 
00:41:48.053 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cyborg.status_code_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.053 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.053 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cyborg.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.053 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cyborg.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.054 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] cyborg.version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.054 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] database.backend = sqlalchemy log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.054 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] database.connection = **** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.054 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] database.connection_debug = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.054 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] database.connection_parameters = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.055 133300 DEBUG oslo_service.service 
[req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.055 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] database.connection_trace = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.055 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.055 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] database.db_max_retries = 20 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.055 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.055 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] database.db_retry_interval = 1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.055 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] database.max_overflow = 50 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.057 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:41:48.058 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:41:48.058 134146 DEBUG nova.service [req-650dfd50-bd20-4d9b-8bba-cb4803376ad1 - - - - -] Join ServiceGroup membership for this service scheduler start 
/usr/lib/python3/dist-packages/nova/service.py:199 2026-04-23 00:41:48.059 134146 DEBUG nova.servicegroup.drivers.db [req-650dfd50-bd20-4d9b-8bba-cb4803376ad1 - - - - -] DB_Driver: join new ServiceGroup member juju-0097f2-0-lxd-7 to the scheduler group, service = join /usr/lib/python3/dist-packages/nova/servicegroup/drivers/db.py:44 2026-04-23 00:41:48.062 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:41:48.062 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:41:48.062 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:41:48.056 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] database.max_pool_size = 4 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.064 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] database.max_retries = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.065 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] database.mysql_enable_ndb = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.065 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.065 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] database.pool_timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.065 133300 DEBUG oslo_service.service 
[req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] database.retry_interval = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.066 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] database.slave_connection = **** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.066 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] database.sqlite_synchronous = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.066 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] database.use_db_reconnect = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.066 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api_database.backend = sqlalchemy log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.067 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api_database.connection = **** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.067 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api_database.connection_debug = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.067 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api_database.connection_parameters = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.067 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.067 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] 
api_database.connection_trace = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.068 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.068 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api_database.db_max_retries = 20 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.068 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.068 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.068 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api_database.max_overflow = 50 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.069 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api_database.max_pool_size = 4 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.069 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api_database.max_retries = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.069 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api_database.mysql_enable_ndb = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.069 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api_database.mysql_sql_mode = TRADITIONAL 
log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.069 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api_database.pool_timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.069 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api_database.retry_interval = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.070 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api_database.slave_connection = **** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.070 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.070 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] devices.enabled_mdev_types = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.070 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.070 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.071 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.071 133300 WARNING oslo_config.cfg [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] Deprecated: Option "api_servers" from group "glance" is 
deprecated for removal ( Support for image service configuration via standard keystoneauth1 Adapter options was added in the 17.0.0 Queens release. The api_servers option was retained temporarily to allow consumers time to cut over to a real load balancing solution. ). Its value may be silently ignored in the future. 2026-04-23 00:41:48.071 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] glance.api_servers = ['http://252.129.206.87:9292'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.071 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] glance.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.072 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] glance.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.072 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] glance.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.072 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] glance.connect_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.072 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] glance.connect_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.072 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] glance.debug = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.072 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.073 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.073 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] glance.enable_rbd_download = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.073 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] glance.endpoint_override = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.073 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] glance.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.073 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] glance.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.073 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] glance.max_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.074 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] glance.min_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.074 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] glance.num_retries = 3 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.074 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] glance.rbd_ceph_conf = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.074 133300 DEBUG oslo_service.service 
[req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] glance.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.074 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] glance.rbd_pool = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.074 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] glance.rbd_user = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.075 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] glance.region_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.075 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] glance.service_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.075 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] glance.service_type = image log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.075 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] glance.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.076 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] glance.status_code_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.076 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.076 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] glance.timeout = None log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.076 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] glance.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.076 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.077 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] glance.version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.077 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] guestfs.debug = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.078 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] hyperv.config_drive_cdrom = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.078 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.078 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] hyperv.dynamic_memory_ratio = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.079 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.079 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] hyperv.enable_remotefx = False log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.080 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] hyperv.instances_path_share = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.080 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] hyperv.iscsi_initiator_list = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.080 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] hyperv.limit_cpu_features = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.080 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.081 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.081 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.082 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.082 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] hyperv.qemu_img_cmd = qemu-img.exe log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.082 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] hyperv.use_multipath_io = False log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.084 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.084 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.084 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] hyperv.vswitch_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.084 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.085 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] mks.enabled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.086 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] mks.mksproxy_base_url = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.087 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] image_cache.manager_interval = 2400 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.087 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.087 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] image_cache.remove_unused_base_images = True log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.087 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.088 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.088 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] image_cache.subdirectory_name = _base log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.088 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] ironic.api_max_retries = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.088 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] ironic.api_retry_interval = 2 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.089 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] ironic.auth_section = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.089 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] ironic.auth_type = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.089 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] ironic.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.089 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] ironic.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 
2026-04-23 00:41:48.089 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] ironic.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.090 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] ironic.connect_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.090 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.090 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] ironic.endpoint_override = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.090 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] ironic.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.090 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] ironic.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.091 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] ironic.max_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.091 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] ironic.min_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.091 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] ironic.partition_key = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.091 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] 
ironic.peer_list = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.091 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] ironic.region_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.091 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.092 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] ironic.service_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.092 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] ironic.service_type = baremetal log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.092 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] ironic.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.092 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] ironic.status_code_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.093 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.093 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] ironic.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.093 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] ironic.valid_interfaces = ['internal', 'public'] log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.093 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] ironic.version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.094 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] key_manager.backend = barbican log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.094 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] key_manager.fixed_key = **** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.094 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] barbican.auth_endpoint = http://localhost/identity/v3 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.094 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] barbican.barbican_api_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.095 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] barbican.barbican_endpoint = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.095 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] barbican.barbican_endpoint_type = public log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.095 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] barbican.barbican_region_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.095 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] barbican.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 
2026-04-23 00:41:48.095 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] barbican.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.096 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] barbican.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.096 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] barbican.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.096 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] barbican.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.096 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] barbican.number_of_retries = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.096 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] barbican.retry_delay = 1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.096 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.097 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] barbican.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.097 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] barbican.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.097 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] 
barbican.verify_ssl = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.097 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] barbican.verify_ssl_path = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.097 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.097 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.098 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] barbican_service_user.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.098 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.098 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.098 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.098 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] barbican_service_user.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.099 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] barbican_service_user.split_loggers 
= False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.099 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] barbican_service_user.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.099 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vault.approle_role_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.099 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vault.approle_secret_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.099 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vault.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.100 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vault.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.100 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vault.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.100 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vault.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.100 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vault.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.100 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vault.kv_mountpoint = secret log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.100 133300 DEBUG 
oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vault.kv_version = 2 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.101 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vault.namespace = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.101 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vault.root_token_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.101 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vault.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.101 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vault.ssl_ca_crt_file = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.101 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vault.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.102 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vault.use_ssl = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.102 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vault.vault_url = http://127.0.0.1:8200 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.102 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] keystone.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.102 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] keystone.certfile = None log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.102 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] keystone.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.103 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] keystone.connect_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.103 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] keystone.connect_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.103 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] keystone.endpoint_override = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.103 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] keystone.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.103 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] keystone.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.104 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] keystone.max_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.104 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] keystone.min_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.104 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] keystone.region_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.104 133300 DEBUG 
oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] keystone.service_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.104 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] keystone.service_type = identity log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.105 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] keystone.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.105 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] keystone.status_code_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.105 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.105 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] keystone.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.105 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] keystone.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.106 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] keystone.version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.106 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.connection_uri = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.106 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] 
libvirt.cpu_mode = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.106 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.cpu_model_extra_flags = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.107 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.cpu_models = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.107 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.107 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.device_detach_timeout = 20 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.107 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.disk_cachemodes = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.107 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.disk_prefix = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.107 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.enabled_perf_events = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.108 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.file_backed_memory = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.108 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.gid_maps = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 
2026-04-23 00:41:48.108 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.hw_disk_discard = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.108 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.hw_machine_type = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.108 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.images_rbd_ceph_conf = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.108 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.109 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.109 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.images_rbd_glance_store_name = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.109 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.images_rbd_pool = rbd log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.109 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.images_type = default log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.109 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.images_volume_group = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.110 133300 DEBUG 
oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.inject_key = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.110 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.inject_partition = -2 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.110 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.inject_password = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.110 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.iscsi_iface = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.110 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.iser_use_multipath = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.111 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.111 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.111 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.111 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.111 133300 DEBUG oslo_service.service 
[req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.112 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.112 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.live_migration_permit_auto_converge = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.112 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.live_migration_permit_post_copy = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.112 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.live_migration_scheme = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.112 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.live_migration_timeout_action = abort log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.112 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.113 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.live_migration_uri = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.113 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.113 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.max_queues = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.113 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.113 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.nfs_mount_options = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.114 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.nfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.114 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.114 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.num_iser_scan_tries = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.114 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.115 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.115 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.num_pcie_ports = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.115 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.num_volume_scan_tries = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.115 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.pmem_namespaces = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.116 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.quobyte_client_cfg = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.116 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.116 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.116 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.116 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.116 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.rbd_secret_uuid = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.117 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.rbd_user = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.117 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.117 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.117 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.rescue_image_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.117 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.rescue_kernel_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.117 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.rescue_ramdisk_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.118 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.rng_dev_path = /dev/urandom log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.118 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.rx_queue_size = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.118 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.smbfs_mount_options = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.118 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.118 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.snapshot_compression = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.118 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.snapshot_image_format = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.119 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.snapshots_directory = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.119 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.119 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.swtpm_enabled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.119 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.swtpm_group = tss log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.120 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.swtpm_user = tss log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.120 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.sysinfo_serial = unique log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.120 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.tx_queue_size = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.120 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.uid_maps = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.120 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.120 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.virt_type = kvm log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.121 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.volume_clear = zero log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.121 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.volume_clear_size = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.122 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.volume_use_multipath = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.122 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.vzstorage_cache_path = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.122 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.122 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.vzstorage_mount_group = qemu log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.122 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.vzstorage_mount_opts = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.122 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.vzstorage_mount_perms = 0770 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.123 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.123 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.vzstorage_mount_user = stack log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.123 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.123 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] neutron.auth_section = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.124 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] neutron.auth_type = password log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.124 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] neutron.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.124 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] neutron.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.124 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] neutron.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.125 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] neutron.connect_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.125 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] neutron.connect_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.125 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] neutron.default_floating_pool = nova log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.125 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] neutron.endpoint_override = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.125 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.126 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] neutron.http_retries = 3 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.126 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] neutron.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.126 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] neutron.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.126 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] neutron.max_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.126 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.126 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] neutron.min_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.126 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] neutron.ovs_bridge = br-int log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.127 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] neutron.physnets = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.127 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] neutron.region_name = RegionOne log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.127 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.127 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] neutron.service_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.127 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] neutron.service_type = network log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.127 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] neutron.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.128 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] neutron.status_code_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.128 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.128 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] neutron.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.128 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] neutron.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.128 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] neutron.version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.128 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.129 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] notifications.default_level = INFO log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.129 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.129 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.129 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.129 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] pci.alias = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.130 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] pci.passthrough_whitelist = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.130 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.auth_section = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.130 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.auth_type = password log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.130 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.auth_url = http://252.129.220.164:35357 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.130 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.131 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.131 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.131 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.connect_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.131 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.connect_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.131 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.default_domain_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.131 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.default_domain_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.132 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.domain_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.132 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.domain_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.132 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.endpoint_override = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.132 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.132 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.132 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.max_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.133 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.min_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.133 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.password = **** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.133 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.project_domain_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.133 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.project_domain_name = service_domain log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.133 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.project_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.133 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.project_name = services log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.134 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.region_name = RegionOne log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.134 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.service_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.134 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.service_type = placement log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.134 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.135 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.status_code_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.135 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.135 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.system_scope = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.135 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.135 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.trust_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.136 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.user_domain_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.136 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.user_domain_name = service_domain log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.136 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.user_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.136 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.username = nova log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.136 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.136 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] placement.version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.137 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] powervm.disk_driver = localdisk log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.137 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] powervm.proc_units_factor = 0.1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.137 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] powervm.volume_group_name = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.137 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] quota.cores = 20 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.137 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.138 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] quota.driver = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.138 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.138 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.138 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] quota.injected_files = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.138 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] quota.instances = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.139 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] quota.key_pairs = 100 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.139 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] quota.metadata_items = 128 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.139 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] quota.ram = 51200 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.139 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] quota.recheck_quota = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.139 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] quota.server_group_members = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.139 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] quota.server_groups = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.140 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] rdp.enabled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.140 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.140 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] scheduler.discover_hosts_in_cells_interval = 30 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.140 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.141 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.141 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.141 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] scheduler.max_attempts = 3 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.141 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.142 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.142 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.143 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.144 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.144 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] scheduler.workers = 4 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.144 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.144 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.145 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.146 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] filter_scheduler.build_failure_weight_multiplier = 0.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.146 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.146 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.146 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.147 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] filter_scheduler.enabled_filters = ['AvailabilityZoneFilter', 'ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'DifferentHostFilter', 'SameHostFilter'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.147 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.147 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.147 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.147 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.149 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.149 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.149 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.149 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.149 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.150 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.150 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.150 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.150 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.150 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.151 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.152 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] metrics.required = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.152 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] metrics.weight_multiplier = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.152 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] metrics.weight_of_unavailable = -10000.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-23 00:41:48.152 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] metrics.weight_setting = [] log_opt_values
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.152 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] serial_console.base_url = wss://252.129.195.220:6083/ log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.153 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] serial_console.enabled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.153 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] serial_console.port_range = 10000:20000 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.153 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.153 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.153 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.154 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] service_user.auth_section = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.154 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] service_user.auth_type = password log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.154 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] service_user.cafile = None log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.154 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] service_user.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.154 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] service_user.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.155 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] service_user.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.155 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] service_user.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.155 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.155 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] service_user.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.155 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] service_user.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.156 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] spice.agent_enabled = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.156 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] spice.enabled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.156 133300 DEBUG 
oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.156 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] spice.html5proxy_host = 0.0.0.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.157 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] spice.html5proxy_port = 6082 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.158 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] spice.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.158 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.158 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] upgrade_levels.baseapi = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.158 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] upgrade_levels.cert = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.158 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] upgrade_levels.compute = auto log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.158 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] upgrade_levels.conductor = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.159 133300 DEBUG oslo_service.service 
[req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] upgrade_levels.scheduler = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.159 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.159 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.159 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.160 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.160 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.161 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.161 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.161 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.161 133300 DEBUG oslo_service.service 
[req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.161 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.161 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vmware.ca_file = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.162 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vmware.cache_prefix = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.162 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vmware.cluster_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.162 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vmware.connection_pool_size = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.162 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vmware.console_delay_seconds = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.162 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vmware.datastore_regex = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.162 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vmware.host_ip = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.163 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vmware.host_password = **** log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.163 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vmware.host_port = 443 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.163 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vmware.host_username = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.163 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vmware.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.163 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vmware.integration_bridge = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.163 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vmware.maximum_objects = 100 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.164 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vmware.pbm_default_policy = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.164 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vmware.pbm_enabled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.164 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vmware.pbm_wsdl_location = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.164 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vmware.serial_log_dir = /opt/vmware/vspc log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.164 133300 DEBUG 
oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vmware.serial_port_proxy_uri = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.164 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.164 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.165 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vmware.use_linked_clone = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.165 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vmware.vnc_keymap = en-us log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.165 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vmware.vnc_port = 5900 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.165 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vmware.vnc_port_total = 10000 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.165 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vnc.auth_schemes = ['none'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.166 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vnc.enabled = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.166 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vnc.novncproxy_base_url = 
http://127.0.0.1:6080/vnc_auto.html log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.166 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vnc.novncproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.167 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vnc.novncproxy_port = 6080 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.168 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vnc.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.168 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vnc.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.168 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vnc.vencrypt_ca_certs = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.168 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vnc.vencrypt_client_cert = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.168 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] vnc.vencrypt_client_key = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.169 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.169 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] workarounds.disable_deep_image_inspection = False 
log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.169 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.169 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.169 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.170 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] workarounds.disable_rootwrap = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.170 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.170 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] workarounds.enable_qemu_monitor_announce_self = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.170 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.171 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.171 133300 DEBUG oslo_service.service 
[req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.171 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.172 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] workarounds.reserve_disk_resource_for_image_cache = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.172 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.172 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] workarounds.skip_cpu_compare_on_dest = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.172 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.172 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.172 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.173 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] wsgi.api_paste_config = /etc/nova/api-paste.ini log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.173 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] wsgi.client_socket_timeout = 900 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.173 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] wsgi.default_pool_size = 1000 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.173 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] wsgi.keep_alive = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.173 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] wsgi.max_header_line = 16384 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.174 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] wsgi.secure_proxy_ssl_header = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.174 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] wsgi.ssl_ca_file = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.174 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] wsgi.ssl_cert_file = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.174 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] wsgi.ssl_key_file = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.174 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] wsgi.tcp_keepidle = 600 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.176 133300 DEBUG oslo_service.service 
[req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.176 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] zvm.ca_file = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.176 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] zvm.cloud_connector_url = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.177 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] zvm.image_tmp_path = /var/lib/nova/images log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.177 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] zvm.reachable_timeout = 300 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.177 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.177 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.178 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.178 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.178 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.179 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.179 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_policy.enforce_scope = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.179 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.179 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.179 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.180 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.181 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.181 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] 
oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.181 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.181 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.181 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.181 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.182 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] profiler.connection_string = messaging:// log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.182 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] profiler.enabled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.183 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] profiler.es_doc_type = notification log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.183 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] profiler.es_scroll_size = 10000 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.183 133300 DEBUG oslo_service.service 
[req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] profiler.es_scroll_time = 2m log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.183 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] profiler.filter_error_trace = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.183 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] profiler.hmac_keys = SECRET_KEY log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.184 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.184 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] profiler.socket_timeout = 0.1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.184 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] profiler.trace_sqlalchemy = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.184 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] remote_debug.host = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.184 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] remote_debug.port = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.185 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.185 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] 
oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.185 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.185 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.186 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.186 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.186 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.187 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.187 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.187 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.187 133300 DEBUG 
oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.188 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.188 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.189 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.189 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.189 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.189 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.190 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.190 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 
log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.191 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.191 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.191 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.191 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.192 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.192 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.192 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.193 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.193 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_notifications.driver = 
['messagingv2'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.193 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.193 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.194 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.194 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_limit.auth_section = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.194 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_limit.auth_type = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.194 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_limit.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.194 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_limit.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.195 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_limit.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.195 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_limit.connect_retries = None log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.195 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.195 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_limit.endpoint_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.196 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_limit.endpoint_override = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.197 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_limit.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.197 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_limit.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.197 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_limit.max_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.197 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_limit.min_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.197 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_limit.region_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.198 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_limit.service_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.198 133300 DEBUG 
oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_limit.service_type = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.198 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_limit.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.199 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.199 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.199 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_limit.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.200 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_limit.valid_interfaces = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.200 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_limit.version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.200 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.200 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.201 133300 DEBUG oslo_service.service 
[req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] oslo_reports.log_dir = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-23 00:41:48.201 133300 DEBUG oslo_service.service [req-96e3b746-8575-4071-9a6b-d69f37069f50 - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2613 2026-04-23 00:41:49.012 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:41:49.012 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:41:49.012 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:41:49.046 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:41:49.046 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:41:49.047 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:41:49.051 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:41:49.051 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:41:49.052 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:41:49.064 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:41:49.064 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:41:49.064 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:41:51.015 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:41:51.015 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:41:51.015 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:41:51.049 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:41:51.049 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:41:51.049 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:41:51.052 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:41:51.053 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:41:51.053 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:41:51.066 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:41:51.067 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:41:51.067 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:41:55.018 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:41:55.019 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:41:55.019 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:41:55.051 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:41:55.051 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:41:55.051 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:41:55.056 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection 
timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:41:55.056 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:41:55.056 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:41:55.070 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:41:55.070 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:41:55.071 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:42:03.019 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:42:03.020 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:42:03.020 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:42:03.052 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:42:03.052 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:42:03.053 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection 
consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:42:03.057 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:42:03.058 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:42:03.058 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:42:03.071 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:42:03.072 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:42:03.072 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:42:30.067 134146 DEBUG oslo_service.periodic_task [req-d252ecd0-be21-4122-b3bc-49c0b6a35812 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:42:30.069 134146 DEBUG oslo_db.sqlalchemy.engines [req-d252ecd0-be21-4122-b3bc-49c0b6a35812 - - - - -] Parent process 133300 forked (134146) with an open database connection, which is being discarded and recreated. 
checkout /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:413 2026-04-23 00:42:30.075 134146 DEBUG oslo_concurrency.lockutils [req-d252ecd0-be21-4122-b3bc-49c0b6a35812 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:42:30.076 134146 DEBUG oslo_concurrency.lockutils [req-d252ecd0-be21-4122-b3bc-49c0b6a35812 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:42:30.082 134146 DEBUG oslo_db.sqlalchemy.engines [req-d252ecd0-be21-4122-b3bc-49c0b6a35812 - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-23 00:42:30.134 134146 INFO nova.scheduler.manager [req-d252ecd0-be21-4122-b3bc-49c0b6a35812 - - - - -] Discovered 1 new hosts: cell1:cn-jenkins-deploy-platform-juju-os-697-1 2026-04-23 00:42:32.994 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:42:32.994 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:42:32.994 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:42:33.034 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:42:33.034 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:42:33.034 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:42:33.035 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:42:33.036 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:42:33.036 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:42:33.050 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:42:33.050 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:42:33.050 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:42:40.018 134138 DEBUG oslo_service.periodic_task [req-cc0ac6cc-4874-4f3f-8274-c69f956e7b68 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:42:40.019 134138 DEBUG oslo_db.sqlalchemy.engines [req-cc0ac6cc-4874-4f3f-8274-c69f956e7b68 - - - - -] Parent process 133300 forked (134138) with an open database connection, which is being 
discarded and recreated. checkout /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:413 2026-04-23 00:42:40.026 134138 DEBUG oslo_concurrency.lockutils [req-cc0ac6cc-4874-4f3f-8274-c69f956e7b68 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:42:40.028 134138 DEBUG oslo_concurrency.lockutils [req-cc0ac6cc-4874-4f3f-8274-c69f956e7b68 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:42:40.034 134138 DEBUG oslo_db.sqlalchemy.engines [req-cc0ac6cc-4874-4f3f-8274-c69f956e7b68 - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-23 00:42:46.050 134145 DEBUG oslo_service.periodic_task [req-840be151-a32e-4610-a932-295f0d75052b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:42:46.051 134145 DEBUG oslo_db.sqlalchemy.engines [req-840be151-a32e-4610-a932-295f0d75052b - - - - -] Parent process 133300 forked (134145) with an open database connection, which is being discarded and recreated. 
checkout /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:413 2026-04-23 00:42:46.058 134145 DEBUG oslo_concurrency.lockutils [req-840be151-a32e-4610-a932-295f0d75052b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:42:46.059 134145 DEBUG oslo_concurrency.lockutils [req-840be151-a32e-4610-a932-295f0d75052b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:42:46.064 134145 DEBUG oslo_db.sqlalchemy.engines [req-840be151-a32e-4610-a932-295f0d75052b - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-23 00:42:47.050 134140 DEBUG oslo_service.periodic_task [req-31b6468d-1246-4243-a06b-fe7c3d3394af - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:42:47.051 134140 DEBUG oslo_db.sqlalchemy.engines [req-31b6468d-1246-4243-a06b-fe7c3d3394af - - - - -] Parent process 133300 forked (134140) with an open database connection, which is being discarded and recreated. 
checkout /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:413 2026-04-23 00:42:47.058 134140 DEBUG oslo_concurrency.lockutils [req-31b6468d-1246-4243-a06b-fe7c3d3394af - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:42:47.059 134140 DEBUG oslo_concurrency.lockutils [req-31b6468d-1246-4243-a06b-fe7c3d3394af - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:42:47.065 134140 DEBUG oslo_db.sqlalchemy.engines [req-31b6468d-1246-4243-a06b-fe7c3d3394af - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-23 00:43:00.139 134146 DEBUG oslo_service.periodic_task [req-d252ecd0-be21-4122-b3bc-49c0b6a35812 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:43:00.142 134146 DEBUG oslo_concurrency.lockutils [req-d824a527-7eaa-4ed9-951a-38d178a617f8 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:43:00.143 134146 DEBUG oslo_concurrency.lockutils [req-d824a527-7eaa-4ed9-951a-38d178a617f8 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:43:04.997 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:43:04.998 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:43:04.998 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:43:05.037 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:43:05.038 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:43:05.038 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:43:05.039 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:43:05.039 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:43:05.039 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:43:05.053 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:43:05.053 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:43:05.053 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:43:10.051 134138 DEBUG oslo_service.periodic_task [req-cc0ac6cc-4874-4f3f-8274-c69f956e7b68 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:43:10.056 134138 DEBUG oslo_concurrency.lockutils [req-2ca1fa21-c983-4a7a-b1b9-5e3e5adc14ff - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:43:10.056 134138 DEBUG oslo_concurrency.lockutils [req-2ca1fa21-c983-4a7a-b1b9-5e3e5adc14ff - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:43:16.078 134145 DEBUG oslo_service.periodic_task [req-840be151-a32e-4610-a932-295f0d75052b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:43:16.084 134145 DEBUG oslo_concurrency.lockutils [req-269e8d37-8dec-40ee-bf8b-19d84bd16ffe - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:43:16.085 134145 DEBUG oslo_concurrency.lockutils [req-269e8d37-8dec-40ee-bf8b-19d84bd16ffe - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:43:17.081 134140 DEBUG oslo_service.periodic_task [req-31b6468d-1246-4243-a06b-fe7c3d3394af - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:43:17.085 134140 DEBUG oslo_concurrency.lockutils [req-b2db08b3-3a8b-4bbc-bca9-d996194d4b49 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:43:17.086 134140 DEBUG oslo_concurrency.lockutils [req-b2db08b3-3a8b-4bbc-bca9-d996194d4b49 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:43:30.154 134146 DEBUG oslo_service.periodic_task [req-d824a527-7eaa-4ed9-951a-38d178a617f8 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:43:30.158 134146 DEBUG oslo_concurrency.lockutils [req-05c15622-2d61-4cd5-af92-5bb1e06d8590 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:43:30.159 134146 DEBUG oslo_concurrency.lockutils [req-05c15622-2d61-4cd5-af92-5bb1e06d8590 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:43:41.020 134138 DEBUG oslo_service.periodic_task [req-2ca1fa21-c983-4a7a-b1b9-5e3e5adc14ff - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:43:41.024 134138 DEBUG oslo_concurrency.lockutils [req-483c8410-6b82-4395-8fb2-a36e45f9b94f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:43:41.024 134138 DEBUG oslo_concurrency.lockutils [req-483c8410-6b82-4395-8fb2-a36e45f9b94f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:43:46.092 134145 DEBUG oslo_service.periodic_task [req-269e8d37-8dec-40ee-bf8b-19d84bd16ffe - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:43:46.096 134145 DEBUG oslo_concurrency.lockutils [req-a16f5b24-b5e6-4940-b131-e0ae87011d7b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:43:46.096 134145 DEBUG oslo_concurrency.lockutils [req-a16f5b24-b5e6-4940-b131-e0ae87011d7b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:43:47.094 134140 DEBUG
oslo_service.periodic_task [req-b2db08b3-3a8b-4bbc-bca9-d996194d4b49 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:43:47.098 134140 DEBUG oslo_concurrency.lockutils [req-f2a33b13-12a9-42ea-b44d-d42a77b10269 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:43:47.098 134140 DEBUG oslo_concurrency.lockutils [req-f2a33b13-12a9-42ea-b44d-d42a77b10269 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:44:00.168 134146 DEBUG oslo_service.periodic_task [req-05c15622-2d61-4cd5-af92-5bb1e06d8590 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:44:00.173 134146 DEBUG oslo_concurrency.lockutils [req-3f07738a-651a-4e39-a612-7ac9acf35646 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:44:00.173 134146 DEBUG oslo_concurrency.lockutils [req-3f07738a-651a-4e39-a612-7ac9acf35646 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:44:09.004 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:44:09.004 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:44:09.005 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:44:09.045 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:44:09.045 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:44:09.045 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:44:09.045 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:44:09.045 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:44:09.045 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:44:09.058 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:44:09.058 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:44:09.058 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:44:11.031 134138 DEBUG oslo_service.periodic_task [req-483c8410-6b82-4395-8fb2-a36e45f9b94f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:44:11.035 134138 DEBUG oslo_concurrency.lockutils [req-e76e655a-7237-44b1-b1fd-e01d06748e45 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:44:11.036 134138 DEBUG oslo_concurrency.lockutils [req-e76e655a-7237-44b1-b1fd-e01d06748e45 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:44:16.102 134145 DEBUG oslo_service.periodic_task [req-a16f5b24-b5e6-4940-b131-e0ae87011d7b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:44:16.107 134145 DEBUG oslo_concurrency.lockutils [req-4b4413dd-0439-4106-9a80-579f71199c4a - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:44:16.108 134145 DEBUG oslo_concurrency.lockutils [req-4b4413dd-0439-4106-9a80-579f71199c4a - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:44:17.107 134140 DEBUG
oslo_service.periodic_task [req-f2a33b13-12a9-42ea-b44d-d42a77b10269 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:44:17.111 134140 DEBUG oslo_concurrency.lockutils [req-e851136d-05e3-40f8-8405-5189000f5d3e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:44:17.111 134140 DEBUG oslo_concurrency.lockutils [req-e851136d-05e3-40f8-8405-5189000f5d3e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:44:30.183 134146 DEBUG oslo_service.periodic_task [req-3f07738a-651a-4e39-a612-7ac9acf35646 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:44:30.187 134146 DEBUG oslo_concurrency.lockutils [req-b165b758-222d-4e01-873c-d6a6a944fbed - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:44:30.187 134146 DEBUG oslo_concurrency.lockutils [req-b165b758-222d-4e01-873c-d6a6a944fbed - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:44:42.019 134138 DEBUG oslo_service.periodic_task [req-e76e655a-7237-44b1-b1fd-e01d06748e45 - - - - -] Running periodic task
SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:44:42.023 134138 DEBUG oslo_concurrency.lockutils [req-e848041b-464b-4d8a-bdcc-1460d45d9408 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:44:42.024 134138 DEBUG oslo_concurrency.lockutils [req-e848041b-464b-4d8a-bdcc-1460d45d9408 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:44:46.114 134145 DEBUG oslo_service.periodic_task [req-4b4413dd-0439-4106-9a80-579f71199c4a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:44:46.118 134145 DEBUG oslo_concurrency.lockutils [req-95f44083-f906-40ad-ac83-e419927cab3d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:44:46.118 134145 DEBUG oslo_concurrency.lockutils [req-95f44083-f906-40ad-ac83-e419927cab3d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:44:47.119 134140 DEBUG oslo_service.periodic_task [req-e851136d-05e3-40f8-8405-5189000f5d3e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:44:47.124 134140 DEBUG oslo_concurrency.lockutils [req-09c8a079-0515-473b-b0ed-4162af1102ec - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:44:47.125 134140 DEBUG oslo_concurrency.lockutils [req-09c8a079-0515-473b-b0ed-4162af1102ec - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:44:52.995 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 7772682ab2414b279a358fd742b836fe __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 00:44:52.995 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 7772682ab2414b279a358fd742b836fe __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 00:44:52.995 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 7772682ab2414b279a358fd742b836fe __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 00:44:52.996 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:44:52.996 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:44:52.996 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:44:52.996 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id:
7772682ab2414b279a358fd742b836fe poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 00:44:52.996 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 7772682ab2414b279a358fd742b836fe poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 00:44:52.996 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 7772682ab2414b279a358fd742b836fe __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 00:44:52.996 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 7772682ab2414b279a358fd742b836fe poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 00:44:52.996 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:44:52.996 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:44:52.996 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:44:52.996 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:44:52.996 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:44:52.996 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:44:52.996 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id:
7772682ab2414b279a358fd742b836fe poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 00:44:52.996 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:44:52.996 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:44:52.997 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:44:52.997 134145 DEBUG oslo_concurrency.lockutils [req-37395d4b-a1ec-440f-bd6f-ed45ac2924ba - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:44:52.997 134138 DEBUG oslo_concurrency.lockutils [req-37395d4b-a1ec-440f-bd6f-ed45ac2924ba - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:44:52.997 134140 DEBUG oslo_concurrency.lockutils [req-37395d4b-a1ec-440f-bd6f-ed45ac2924ba - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:44:52.997 134146 DEBUG oslo_concurrency.lockutils [req-37395d4b-a1ec-440f-bd6f-ed45ac2924ba - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:44:52.998 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:44:52.998 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:44:52.998 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:44:52.998 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:44:52.999 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:44:52.999 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:44:52.999 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:44:52.999 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:44:52.999 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:44:52.999 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:44:52.999 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:44:52.999 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:44:53.003 134146 DEBUG oslo_concurrency.lockutils [req-37395d4b-a1ec-440f-bd6f-ed45ac2924ba - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:44:53.004 134146 DEBUG oslo_concurrency.lockutils [req-37395d4b-a1ec-440f-bd6f-ed45ac2924ba - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:44:53.006 134138 DEBUG oslo_concurrency.lockutils [req-37395d4b-a1ec-440f-bd6f-ed45ac2924ba - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:44:53.006 134138 DEBUG oslo_concurrency.lockutils [req-37395d4b-a1ec-440f-bd6f-ed45ac2924ba - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:44:53.008 134140 DEBUG oslo_concurrency.lockutils [req-37395d4b-a1ec-440f-bd6f-ed45ac2924ba - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:44:53.008 134140 DEBUG oslo_concurrency.lockutils [req-37395d4b-a1ec-440f-bd6f-ed45ac2924ba - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:44:53.008 134145 DEBUG oslo_concurrency.lockutils [req-37395d4b-a1ec-440f-bd6f-ed45ac2924ba - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:44:53.008 134145 DEBUG oslo_concurrency.lockutils [req-37395d4b-a1ec-440f-bd6f-ed45ac2924ba - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:44:53.009 134146 INFO nova.scheduler.host_manager [req-37395d4b-a1ec-440f-bd6f-ed45ac2924ba - - - - -] Received a sync request from an unknown host 'cn-jenkins-deploy-platform-juju-os-697-1'. Re-created its InstanceList.
2026-04-23 00:44:53.009 134146 DEBUG oslo_concurrency.lockutils [req-37395d4b-a1ec-440f-bd6f-ed45ac2924ba - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.012s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:44:53.010 134138 INFO nova.scheduler.host_manager [req-37395d4b-a1ec-440f-bd6f-ed45ac2924ba - - - - -] Received a sync request from an unknown host 'cn-jenkins-deploy-platform-juju-os-697-1'. Re-created its InstanceList.
2026-04-23 00:44:53.010 134138 DEBUG oslo_concurrency.lockutils [req-37395d4b-a1ec-440f-bd6f-ed45ac2924ba - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.013s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:44:53.012 134145 INFO nova.scheduler.host_manager [req-37395d4b-a1ec-440f-bd6f-ed45ac2924ba - - - - -] Received a sync request from an unknown host 'cn-jenkins-deploy-platform-juju-os-697-1'. Re-created its InstanceList.
2026-04-23 00:44:53.012 134145 DEBUG oslo_concurrency.lockutils [req-37395d4b-a1ec-440f-bd6f-ed45ac2924ba - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.016s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:44:53.014 134140 INFO nova.scheduler.host_manager [req-37395d4b-a1ec-440f-bd6f-ed45ac2924ba - - - - -] Received a sync request from an unknown host 'cn-jenkins-deploy-platform-juju-os-697-1'. Re-created its InstanceList.
2026-04-23 00:44:53.015 134140 DEBUG oslo_concurrency.lockutils [req-37395d4b-a1ec-440f-bd6f-ed45ac2924ba - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.018s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:44:54.000 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:44:54.000 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:44:54.001 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:44:54.001 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:44:54.001 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:44:54.001 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:44:54.001 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:44:54.001 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:44:54.001 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:44:54.002 134145 DEBUG
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:44:54.002 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:44:54.002 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:44:56.004 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:44:56.004 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:44:56.004 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:44:56.004 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:44:56.004 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:44:56.004 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:44:56.004 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:44:56.004 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:44:56.004
134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:44:56.005 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:44:56.005 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:44:56.005 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:45:00.006 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:45:00.006 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:45:00.007 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:45:00.006 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:45:00.007 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:45:00.007 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:45:00.007 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:45:00.007 
134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:45:00.008 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:45:00.009 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:45:00.009 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:45:00.009 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:45:01.069 134146 DEBUG oslo_service.periodic_task [req-b165b758-222d-4e01-873c-d6a6a944fbed - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:45:01.075 134146 DEBUG oslo_concurrency.lockutils [req-defe3b95-28c8-4b2c-9af1-0df904a9d356 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:45:01.075 134146 DEBUG oslo_concurrency.lockutils [req-defe3b95-28c8-4b2c-9af1-0df904a9d356 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:45:08.012 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:45:08.012 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:45:08.012 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:45:08.012 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:45:08.012 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:45:08.013 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:45:08.013 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:45:08.013 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:45:08.013 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:45:08.015 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:45:08.016 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:45:08.016 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:45:12.033 134138 DEBUG oslo_service.periodic_task [req-e848041b-464b-4d8a-bdcc-1460d45d9408 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:45:12.040 134138 DEBUG oslo_concurrency.lockutils [req-07d9c8d1-03b3-4add-b816-77c64cb7b17e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:45:12.040 134138 DEBUG oslo_concurrency.lockutils [req-07d9c8d1-03b3-4add-b816-77c64cb7b17e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:45:16.125 134145 DEBUG oslo_service.periodic_task [req-95f44083-f906-40ad-ac83-e419927cab3d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:45:16.131 134145 DEBUG oslo_concurrency.lockutils [req-8f584770-253f-4976-b6e6-784e80ccde60 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:45:16.131 134145 DEBUG oslo_concurrency.lockutils [req-8f584770-253f-4976-b6e6-784e80ccde60 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:45:17.133 134140 DEBUG 
oslo_service.periodic_task [req-09c8a079-0515-473b-b0ed-4162af1102ec - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:45:17.138 134140 DEBUG oslo_concurrency.lockutils [req-3c6985a4-83f5-4105-ae73-b48d201081cc - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:45:17.138 134140 DEBUG oslo_concurrency.lockutils [req-3c6985a4-83f5-4105-ae73-b48d201081cc - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:45:24.013 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:45:24.014 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:45:24.014 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:45:24.015 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:45:24.015 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:45:24.015 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 
2026-04-23 00:45:24.015 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:45:24.015 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:45:24.015 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:45:24.017 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:45:24.017 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:45:24.017 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:45:31.088 134146 DEBUG oslo_service.periodic_task [req-defe3b95-28c8-4b2c-9af1-0df904a9d356 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:45:31.093 134146 DEBUG oslo_concurrency.lockutils [req-a813d02f-68e9-4493-9e8d-96a245f7c23e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:45:31.093 134146 DEBUG oslo_concurrency.lockutils [req-a813d02f-68e9-4493-9e8d-96a245f7c23e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:45:42.049 134138 DEBUG oslo_service.periodic_task [req-07d9c8d1-03b3-4add-b816-77c64cb7b17e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:45:42.054 134138 DEBUG oslo_concurrency.lockutils [req-8d708602-9223-48d9-8393-b473597e99b6 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:45:42.054 134138 DEBUG oslo_concurrency.lockutils [req-8d708602-9223-48d9-8393-b473597e99b6 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:45:47.052 134145 DEBUG oslo_service.periodic_task [req-8f584770-253f-4976-b6e6-784e80ccde60 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:45:47.056 134145 DEBUG oslo_concurrency.lockutils [req-681341ff-7b37-4b49-9295-0c18e2e04515 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:45:47.057 134145 DEBUG oslo_concurrency.lockutils [req-681341ff-7b37-4b49-9295-0c18e2e04515 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:45:47.145 134140 DEBUG 
oslo_service.periodic_task [req-3c6985a4-83f5-4105-ae73-b48d201081cc - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:45:47.149 134140 DEBUG oslo_concurrency.lockutils [req-f11b1984-9eb3-4c49-9640-00b9065f8372 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:45:47.150 134140 DEBUG oslo_concurrency.lockutils [req-f11b1984-9eb3-4c49-9640-00b9065f8372 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:46:01.103 134146 DEBUG oslo_service.periodic_task [req-a813d02f-68e9-4493-9e8d-96a245f7c23e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:46:01.109 134146 DEBUG oslo_concurrency.lockutils [req-8ea2d5bc-0ed9-4d1b-9fc8-d7b13317a67b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:46:01.109 134146 DEBUG oslo_concurrency.lockutils [req-8ea2d5bc-0ed9-4d1b-9fc8-d7b13317a67b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:46:03.000 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:46:03.000 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:46:03.000 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:46:03.034 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:46:03.035 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:46:03.035 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:46:03.042 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:46:03.042 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:46:03.042 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:46:03.059 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:46:03.059 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:46:03.059 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:46:12.066 134138 DEBUG oslo_service.periodic_task [req-8d708602-9223-48d9-8393-b473597e99b6 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:46:12.070 134138 DEBUG oslo_concurrency.lockutils [req-4edb48c9-df03-4c49-b840-a9550cec4ddd - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:46:12.070 134138 DEBUG oslo_concurrency.lockutils [req-4edb48c9-df03-4c49-b840-a9550cec4ddd - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:46:17.065 134145 DEBUG oslo_service.periodic_task [req-681341ff-7b37-4b49-9295-0c18e2e04515 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:46:17.070 134145 DEBUG oslo_concurrency.lockutils [req-539fd405-5745-4c35-ad98-774eb3efdf62 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:46:17.070 134145 DEBUG oslo_concurrency.lockutils [req-539fd405-5745-4c35-ad98-774eb3efdf62 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:46:17.159 134140 DEBUG 
oslo_service.periodic_task [req-f11b1984-9eb3-4c49-9640-00b9065f8372 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:46:17.163 134140 DEBUG oslo_concurrency.lockutils [req-f76447c3-ad5d-4e93-867a-eb2adfef41e1 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:46:17.164 134140 DEBUG oslo_concurrency.lockutils [req-f76447c3-ad5d-4e93-867a-eb2adfef41e1 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:46:31.121 134146 DEBUG oslo_service.periodic_task [req-8ea2d5bc-0ed9-4d1b-9fc8-d7b13317a67b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:46:31.125 134146 DEBUG oslo_concurrency.lockutils [req-0d554a4d-b1e3-42a5-8415-cfc1a64012d1 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:46:31.125 134146 DEBUG oslo_concurrency.lockutils [req-0d554a4d-b1e3-42a5-8415-cfc1a64012d1 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:46:42.085 134138 DEBUG oslo_service.periodic_task [req-4edb48c9-df03-4c49-b840-a9550cec4ddd - - - - -] Running periodic task 
SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:46:42.089 134138 DEBUG oslo_concurrency.lockutils [req-0b0e4554-d84a-4db0-b781-16bf3614ba65 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:46:42.089 134138 DEBUG oslo_concurrency.lockutils [req-0b0e4554-d84a-4db0-b781-16bf3614ba65 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:46:47.076 134145 DEBUG oslo_service.periodic_task [req-539fd405-5745-4c35-ad98-774eb3efdf62 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:46:47.080 134145 DEBUG oslo_concurrency.lockutils [req-a319f821-dc04-49a7-9c2b-bc52a9077673 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:46:47.080 134145 DEBUG oslo_concurrency.lockutils [req-a319f821-dc04-49a7-9c2b-bc52a9077673 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:46:47.172 134140 DEBUG oslo_service.periodic_task [req-f76447c3-ad5d-4e93-867a-eb2adfef41e1 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 
2026-04-23 00:46:47.176 134140 DEBUG oslo_concurrency.lockutils [req-ecd104d0-9fd4-4238-b0b9-94ceb1cb3ba0 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:46:47.177 134140 DEBUG oslo_concurrency.lockutils [req-ecd104d0-9fd4-4238-b0b9-94ceb1cb3ba0 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:46:56.825 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 627c2ad070cc495ea27a1eb518fa1b8f __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 00:46:56.825 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 627c2ad070cc495ea27a1eb518fa1b8f __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 00:46:56.825 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 627c2ad070cc495ea27a1eb518fa1b8f __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 00:46:56.826 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:46:56.826 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:46:56.826 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:46:56.826 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 
627c2ad070cc495ea27a1eb518fa1b8f poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 00:46:56.826 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 627c2ad070cc495ea27a1eb518fa1b8f poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 00:46:56.826 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 627c2ad070cc495ea27a1eb518fa1b8f poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 00:46:56.826 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:46:56.826 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 627c2ad070cc495ea27a1eb518fa1b8f __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 00:46:56.826 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:46:56.826 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:46:56.826 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:46:56.826 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:46:56.826 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:46:56.826 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:46:56.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 627c2ad070cc495ea27a1eb518fa1b8f poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 00:46:56.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:46:56.827 134146 DEBUG oslo_concurrency.lockutils [req-59908cfe-bf30-4631-baa3-ee0ec9ac43cc - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:46:56.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:46:56.827 134140 DEBUG oslo_concurrency.lockutils [req-59908cfe-bf30-4631-baa3-ee0ec9ac43cc - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:46:56.827 134146 DEBUG nova.scheduler.host_manager [req-59908cfe-bf30-4631-baa3-ee0ec9ac43cc - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 00:46:56.827 134140 DEBUG nova.scheduler.host_manager [req-59908cfe-bf30-4631-baa3-ee0ec9ac43cc - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 00:46:56.827 134146 DEBUG oslo_concurrency.lockutils [req-59908cfe-bf30-4631-baa3-ee0ec9ac43cc - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:46:56.827 134145 DEBUG oslo_concurrency.lockutils [req-59908cfe-bf30-4631-baa3-ee0ec9ac43cc - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:46:56.827 134140 DEBUG oslo_concurrency.lockutils [req-59908cfe-bf30-4631-baa3-ee0ec9ac43cc - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:46:56.827 134138 DEBUG oslo_concurrency.lockutils [req-59908cfe-bf30-4631-baa3-ee0ec9ac43cc - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:46:56.827 134145 DEBUG nova.scheduler.host_manager [req-59908cfe-bf30-4631-baa3-ee0ec9ac43cc - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 00:46:56.828 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:46:56.828 134138 DEBUG nova.scheduler.host_manager [req-59908cfe-bf30-4631-baa3-ee0ec9ac43cc - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 00:46:56.828 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:46:56.828 134145 DEBUG oslo_concurrency.lockutils [req-59908cfe-bf30-4631-baa3-ee0ec9ac43cc - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:46:56.828 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:46:56.828 134138 DEBUG oslo_concurrency.lockutils [req-59908cfe-bf30-4631-baa3-ee0ec9ac43cc - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:46:56.828 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:46:56.828 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:46:56.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:46:56.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:46:56.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:46:56.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:46:56.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:46:56.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:46:56.830 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:46:57.829 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:46:57.829 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:46:57.830 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:46:57.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:46:57.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:46:57.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:46:57.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:46:57.831 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:46:57.831 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:46:57.831 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:46:57.831 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:46:57.831 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:46:59.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:46:59.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:46:59.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:46:59.832 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:46:59.832 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:46:59.832 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:46:59.833
134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:46:59.834 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:46:59.834 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:46:59.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:46:59.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:46:59.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:47:01.131 134146 DEBUG oslo_service.periodic_task [req-0d554a4d-b1e3-42a5-8415-cfc1a64012d1 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:47:01.136 134146 DEBUG oslo_concurrency.lockutils [req-001adc4f-3cfc-45c5-8e54-4fdf2fdc519e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:47:01.136 134146 DEBUG oslo_concurrency.lockutils [req-001adc4f-3cfc-45c5-8e54-4fdf2fdc519e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:47:03.833 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:47:03.833 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:47:03.833 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:47:03.834 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:47:03.834 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:47:03.834 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:47:03.835 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:47:03.835 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:47:03.835 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:47:03.836 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:47:03.837 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:47:03.837 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:47:11.842 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:47:11.842 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:47:11.842 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:47:11.842 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:47:11.843 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:47:11.843 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:47:11.844 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:47:11.844 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:47:11.844 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:47:11.844 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:47:11.845 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:47:11.845 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:47:12.095 134138 DEBUG oslo_service.periodic_task [req-0b0e4554-d84a-4db0-b781-16bf3614ba65 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:47:12.099 134138 DEBUG oslo_concurrency.lockutils [req-afe5cf3e-925e-407a-99a6-f6bd73cbe0bb - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:47:12.099 134138 DEBUG oslo_concurrency.lockutils [req-afe5cf3e-925e-407a-99a6-f6bd73cbe0bb - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:47:17.087 134145 DEBUG oslo_service.periodic_task [req-a319f821-dc04-49a7-9c2b-bc52a9077673 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:47:17.091 134145 DEBUG oslo_concurrency.lockutils [req-71ddfab1-7470-41db-9ee7-4872d6b25911 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23
00:47:17.091 134145 DEBUG oslo_concurrency.lockutils [req-71ddfab1-7470-41db-9ee7-4872d6b25911 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:47:17.185 134140 DEBUG oslo_service.periodic_task [req-ecd104d0-9fd4-4238-b0b9-94ceb1cb3ba0 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:47:17.189 134140 DEBUG oslo_concurrency.lockutils [req-f73c909a-6fe1-427d-8d4c-30171fd9194a - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:47:17.189 134140 DEBUG oslo_concurrency.lockutils [req-f73c909a-6fe1-427d-8d4c-30171fd9194a - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:47:27.843 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:47:27.844 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:47:27.844 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:47:27.844 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:47:27.845 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:47:27.845 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:47:27.846 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:47:27.846 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:47:27.846 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:47:27.846 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:47:27.846 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:47:27.846 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:47:31.144 134146 DEBUG oslo_service.periodic_task [req-001adc4f-3cfc-45c5-8e54-4fdf2fdc519e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:47:31.148 134146 DEBUG oslo_concurrency.lockutils [req-14ee511c-2636-4a4a-98c6-199e32905dcf - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:47:31.148 134146 DEBUG oslo_concurrency.lockutils [req-14ee511c-2636-4a4a-98c6-199e32905dcf - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:47:42.110 134138 DEBUG oslo_service.periodic_task [req-afe5cf3e-925e-407a-99a6-f6bd73cbe0bb - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:47:42.114 134138 DEBUG oslo_concurrency.lockutils [req-afda40e3-d31f-493f-b218-62fcc9b33a61 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:47:42.115 134138 DEBUG oslo_concurrency.lockutils [req-afda40e3-d31f-493f-b218-62fcc9b33a61 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:47:47.098 134145 DEBUG oslo_service.periodic_task [req-71ddfab1-7470-41db-9ee7-4872d6b25911 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:47:47.103 134145 DEBUG oslo_concurrency.lockutils [req-e43ef21c-4365-4629-b702-d143dcdb29d8 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:47:47.104 134145 DEBUG oslo_concurrency.lockutils [req-e43ef21c-4365-4629-b702-d143dcdb29d8 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:47:47.195 134140 DEBUG oslo_service.periodic_task [req-f73c909a-6fe1-427d-8d4c-30171fd9194a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:47:47.200 134140 DEBUG oslo_concurrency.lockutils [req-dfa03481-4555-463a-853b-027ec16e8e7a - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:47:47.200 134140 DEBUG oslo_concurrency.lockutils [req-dfa03481-4555-463a-853b-027ec16e8e7a - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:48:02.070 134146 DEBUG oslo_service.periodic_task [req-14ee511c-2636-4a4a-98c6-199e32905dcf - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:48:02.074 134146 DEBUG oslo_concurrency.lockutils [req-3303ac6a-eb3f-4166-99bc-11ab25586baf - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:48:02.074 134146 DEBUG oslo_concurrency.lockutils [req-3303ac6a-eb3f-4166-99bc-11ab25586baf - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f"
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:48:03.000 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:48:03.001 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:48:03.001 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:48:03.038 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:48:03.038 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:48:03.038 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:48:03.042 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:48:03.042 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:48:03.042 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:48:03.062 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 
2026-04-23 00:48:03.062 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:48:03.062 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:48:13.020 134138 DEBUG oslo_service.periodic_task [req-afda40e3-d31f-493f-b218-62fcc9b33a61 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:48:13.024 134138 DEBUG oslo_concurrency.lockutils [req-033f1148-246c-48f2-8ff4-fc0be4a3b76f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:48:13.024 134138 DEBUG oslo_concurrency.lockutils [req-033f1148-246c-48f2-8ff4-fc0be4a3b76f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:48:17.208 134140 DEBUG oslo_service.periodic_task [req-dfa03481-4555-463a-853b-027ec16e8e7a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:48:17.212 134140 DEBUG oslo_concurrency.lockutils [req-0002ead4-4efa-48b0-8bd3-52d861d8899b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:48:17.212 134140 DEBUG oslo_concurrency.lockutils 
[req-0002ead4-4efa-48b0-8bd3-52d861d8899b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:48:18.052 134145 DEBUG oslo_service.periodic_task [req-e43ef21c-4365-4629-b702-d143dcdb29d8 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:48:18.056 134145 DEBUG oslo_concurrency.lockutils [req-a826734f-73e4-4cdb-aefb-a2fe4a5d97eb - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:48:18.056 134145 DEBUG oslo_concurrency.lockutils [req-a826734f-73e4-4cdb-aefb-a2fe4a5d97eb - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:48:32.086 134146 DEBUG oslo_service.periodic_task [req-3303ac6a-eb3f-4166-99bc-11ab25586baf - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:48:32.090 134146 DEBUG oslo_concurrency.lockutils [req-2ab1db20-dcbf-40c7-9a02-18c570f7d885 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:48:32.090 134146 DEBUG oslo_concurrency.lockutils [req-2ab1db20-dcbf-40c7-9a02-18c570f7d885 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:48:44.020 134138 DEBUG oslo_service.periodic_task [req-033f1148-246c-48f2-8ff4-fc0be4a3b76f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:48:44.024 134138 DEBUG oslo_concurrency.lockutils [req-89b9ec9b-2b91-48b3-974f-bdee75124831 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:48:44.024 134138 DEBUG oslo_concurrency.lockutils [req-89b9ec9b-2b91-48b3-974f-bdee75124831 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:48:48.052 134140 DEBUG oslo_service.periodic_task [req-0002ead4-4efa-48b0-8bd3-52d861d8899b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:48:48.056 134140 DEBUG oslo_concurrency.lockutils [req-5721e3a4-b3de-4464-94f6-c405f817bcb3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:48:48.056 134140 DEBUG oslo_concurrency.lockutils [req-5721e3a4-b3de-4464-94f6-c405f817bcb3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:48:48.061 134145 DEBUG oslo_service.periodic_task [req-a826734f-73e4-4cdb-aefb-a2fe4a5d97eb - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:48:48.065 134145 DEBUG oslo_concurrency.lockutils [req-3a4dd73d-ba49-4823-9dff-fd5e030b7b52 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:48:48.065 134145 DEBUG oslo_concurrency.lockutils [req-3a4dd73d-ba49-4823-9dff-fd5e030b7b52 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:49:00.447 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: d322115337f042aea29032e419320ec7 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 00:49:00.447 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: d322115337f042aea29032e419320ec7 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 00:49:00.447 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: d322115337f042aea29032e419320ec7 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 00:49:00.447 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: d322115337f042aea29032e419320ec7 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 00:49:00.447 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:49:00.447 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:49:00.447 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:49:00.447 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:49:00.448 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: d322115337f042aea29032e419320ec7 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 00:49:00.448 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: d322115337f042aea29032e419320ec7 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 00:49:00.448 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: d322115337f042aea29032e419320ec7 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 00:49:00.448 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: d322115337f042aea29032e419320ec7 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 00:49:00.448 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:49:00.448 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:49:00.448 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:49:00.448 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:49:00.448 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:49:00.448 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:49:00.448 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:49:00.448 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:49:00.448 134138 DEBUG oslo_concurrency.lockutils [req-aea9e701-2be6-4a6c-ad55-56311d8a8f5b - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:49:00.448 134145 DEBUG oslo_concurrency.lockutils [req-aea9e701-2be6-4a6c-ad55-56311d8a8f5b - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:49:00.448 134140 DEBUG oslo_concurrency.lockutils [req-aea9e701-2be6-4a6c-ad55-56311d8a8f5b - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:49:00.448 134146 DEBUG oslo_concurrency.lockutils [req-aea9e701-2be6-4a6c-ad55-56311d8a8f5b - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:49:00.449 134138 DEBUG nova.scheduler.host_manager [req-aea9e701-2be6-4a6c-ad55-56311d8a8f5b - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 00:49:00.449 134145 DEBUG nova.scheduler.host_manager [req-aea9e701-2be6-4a6c-ad55-56311d8a8f5b - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 00:49:00.449 134140 DEBUG nova.scheduler.host_manager [req-aea9e701-2be6-4a6c-ad55-56311d8a8f5b - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 00:49:00.449 134146 DEBUG nova.scheduler.host_manager [req-aea9e701-2be6-4a6c-ad55-56311d8a8f5b - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 00:49:00.449 134138 DEBUG oslo_concurrency.lockutils [req-aea9e701-2be6-4a6c-ad55-56311d8a8f5b - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:49:00.449 134145 DEBUG oslo_concurrency.lockutils [req-aea9e701-2be6-4a6c-ad55-56311d8a8f5b - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:49:00.449 134140 DEBUG oslo_concurrency.lockutils [req-aea9e701-2be6-4a6c-ad55-56311d8a8f5b - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:49:00.449 134146 DEBUG oslo_concurrency.lockutils [req-aea9e701-2be6-4a6c-ad55-56311d8a8f5b - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:49:00.449 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:49:00.449 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:49:00.449 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:49:00.449 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:49:00.449 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:49:00.450 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:49:00.450 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:49:00.450 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:49:00.450 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:49:00.451 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:49:00.451 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:49:00.451 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:49:01.451 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:49:01.451 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:49:01.451 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:49:01.451 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:49:01.451 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:49:01.451 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:49:01.452 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:49:01.452 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:49:01.452 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:49:01.452 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:49:01.453 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:49:01.453 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:49:02.095 134146 DEBUG oslo_service.periodic_task [req-2ab1db20-dcbf-40c7-9a02-18c570f7d885 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:49:02.100 
134146 DEBUG oslo_concurrency.lockutils [req-10d2678a-9ed9-4d60-a94d-b919f8918f88 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:49:02.100 134146 DEBUG oslo_concurrency.lockutils [req-10d2678a-9ed9-4d60-a94d-b919f8918f88 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:49:03.452 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:49:03.452 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:49:03.453 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:49:03.452 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:49:03.453 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:49:03.453 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:49:03.453 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:49:03.453 134145 DEBUG oslo_messaging._drivers.amqpdriver 
[-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:49:03.453 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:49:03.454 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:49:03.454 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:49:03.454 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:49:07.456 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:49:07.456 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:49:07.456 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:49:07.457 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:49:07.457 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:49:07.457 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:49:07.457 134145 DEBUG oslo_messaging._drivers.amqpdriver 
[-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:49:07.458 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:49:07.458 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:49:07.458 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:49:07.458 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:49:07.459 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:49:15.020 134138 DEBUG oslo_service.periodic_task [req-89b9ec9b-2b91-48b3-974f-bdee75124831 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:49:15.024 134138 DEBUG oslo_concurrency.lockutils [req-73b543a2-89ab-4e7a-b45b-fb17d1b54112 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:49:15.024 134138 DEBUG oslo_concurrency.lockutils [req-73b543a2-89ab-4e7a-b45b-fb17d1b54112 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:49:15.458 134140 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:49:15.458 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:49:15.458 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:49:15.459 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:49:15.459 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:49:15.460 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:49:15.460 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:49:15.460 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:49:15.460 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:49:15.460 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:49:15.460 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:49:15.461 134138 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:49:18.063 134140 DEBUG oslo_service.periodic_task [req-5721e3a4-b3de-4464-94f6-c405f817bcb3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:49:18.068 134140 DEBUG oslo_concurrency.lockutils [req-8ffa6bde-d289-4683-a9fd-f3c01f97d38f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:49:18.068 134140 DEBUG oslo_concurrency.lockutils [req-8ffa6bde-d289-4683-a9fd-f3c01f97d38f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:49:19.052 134145 DEBUG oslo_service.periodic_task [req-3a4dd73d-ba49-4823-9dff-fd5e030b7b52 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:49:19.056 134145 DEBUG oslo_concurrency.lockutils [req-ec546b56-aac0-423f-bbf0-bb9d34ea7053 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:49:19.056 134145 DEBUG oslo_concurrency.lockutils [req-ec546b56-aac0-423f-bbf0-bb9d34ea7053 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:49:31.460 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:49:31.460 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:49:31.461 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:49:31.461 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:49:31.461 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:49:31.461 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:49:31.462 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:49:31.462 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:49:31.463 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:49:31.463 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:49:31.463 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:49:31.463 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:49:32.105 134146 DEBUG oslo_service.periodic_task [req-10d2678a-9ed9-4d60-a94d-b919f8918f88 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:49:32.109 134146 DEBUG oslo_concurrency.lockutils [req-8c80365b-946a-481e-9cae-b5e2ab9b2f18 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:49:32.109 134146 DEBUG oslo_concurrency.lockutils [req-8c80365b-946a-481e-9cae-b5e2ab9b2f18 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:49:46.020 134138 DEBUG oslo_service.periodic_task [req-73b543a2-89ab-4e7a-b45b-fb17d1b54112 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:49:46.023 134138 DEBUG oslo_concurrency.lockutils [req-9dd9aef6-7460-4259-8763-f477559547ac - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:49:46.024 134138 DEBUG oslo_concurrency.lockutils [req-9dd9aef6-7460-4259-8763-f477559547ac - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:49:48.074 134140 DEBUG oslo_service.periodic_task [req-8ffa6bde-d289-4683-a9fd-f3c01f97d38f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:49:48.078 134140 DEBUG oslo_concurrency.lockutils [req-73b8535b-ad10-4d79-a8c3-5e901e736599 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:49:48.078 134140 DEBUG oslo_concurrency.lockutils [req-73b8535b-ad10-4d79-a8c3-5e901e736599 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:49:50.052 134145 DEBUG oslo_service.periodic_task [req-ec546b56-aac0-423f-bbf0-bb9d34ea7053 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:49:50.056 134145 DEBUG oslo_concurrency.lockutils [req-3aadf848-147a-402f-b674-29e2167eff2f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:49:50.056 134145 DEBUG oslo_concurrency.lockutils [req-3aadf848-147a-402f-b674-29e2167eff2f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:50:03.069 134146 DEBUG oslo_service.periodic_task [req-8c80365b-946a-481e-9cae-b5e2ab9b2f18 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:50:03.073 134146 DEBUG oslo_concurrency.lockutils [req-25630dba-84fa-4f38-8aaa-719da5a80f14 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:50:03.073 134146 DEBUG oslo_concurrency.lockutils [req-25630dba-84fa-4f38-8aaa-719da5a80f14 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:50:03.461 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:50:03.462 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:50:03.462 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:50:03.462 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:50:03.463 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:50:03.463 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener 
connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:50:03.464 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:50:03.464 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:50:03.464 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:50:03.465 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:50:03.465 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:50:03.465 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:50:16.031 134138 DEBUG oslo_service.periodic_task [req-9dd9aef6-7460-4259-8763-f477559547ac - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:50:16.035 134138 DEBUG oslo_concurrency.lockutils [req-41ea9a63-c989-4fc7-aae2-44a2ac8c8ee3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:50:16.035 134138 DEBUG oslo_concurrency.lockutils [req-41ea9a63-c989-4fc7-aae2-44a2ac8c8ee3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:50:19.051 134140 DEBUG oslo_service.periodic_task [req-73b8535b-ad10-4d79-a8c3-5e901e736599 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:50:19.055 134140 DEBUG oslo_concurrency.lockutils [req-eef37a49-5e4c-4e38-a699-55f9467a7313 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:50:19.056 134140 DEBUG oslo_concurrency.lockutils [req-eef37a49-5e4c-4e38-a699-55f9467a7313 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:50:21.052 134145 DEBUG oslo_service.periodic_task [req-3aadf848-147a-402f-b674-29e2167eff2f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:50:21.056 134145 DEBUG oslo_concurrency.lockutils [req-e857526d-4039-4b39-a972-530eeb40ccf2 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:50:21.056 134145 DEBUG oslo_concurrency.lockutils [req-e857526d-4039-4b39-a972-530eeb40ccf2 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:50:34.068 134146 DEBUG oslo_service.periodic_task [req-25630dba-84fa-4f38-8aaa-719da5a80f14 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:50:34.072 134146 DEBUG oslo_concurrency.lockutils [req-efbb2a37-ee21-446a-9318-0ce633c18197 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:50:34.073 134146 DEBUG oslo_concurrency.lockutils [req-efbb2a37-ee21-446a-9318-0ce633c18197 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:50:46.042 134138 DEBUG oslo_service.periodic_task [req-41ea9a63-c989-4fc7-aae2-44a2ac8c8ee3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:50:46.046 134138 DEBUG oslo_concurrency.lockutils [req-8a3ffa3d-c134-405d-be1e-4d652a039316 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:50:46.046 134138 DEBUG oslo_concurrency.lockutils [req-8a3ffa3d-c134-405d-be1e-4d652a039316 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:50:49.061 134140 DEBUG 
oslo_service.periodic_task [req-eef37a49-5e4c-4e38-a699-55f9467a7313 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:50:49.065 134140 DEBUG oslo_concurrency.lockutils [req-1cd6a50d-09f7-4be5-b2bb-113f6c2e7385 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:50:49.065 134140 DEBUG oslo_concurrency.lockutils [req-1cd6a50d-09f7-4be5-b2bb-113f6c2e7385 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:50:51.063 134145 DEBUG oslo_service.periodic_task [req-e857526d-4039-4b39-a972-530eeb40ccf2 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:50:51.067 134145 DEBUG oslo_concurrency.lockutils [req-db2cb4ef-7a84-44c7-934f-f620fe64e47a - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:50:51.067 134145 DEBUG oslo_concurrency.lockutils [req-db2cb4ef-7a84-44c7-934f-f620fe64e47a - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:51:03.828 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: adbc4e17a35d45b7a85e0c1e37fe6114 __call__ 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 00:51:03.828 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: adbc4e17a35d45b7a85e0c1e37fe6114 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 00:51:03.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: adbc4e17a35d45b7a85e0c1e37fe6114 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 00:51:03.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: adbc4e17a35d45b7a85e0c1e37fe6114 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 00:51:03.828 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:51:03.828 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:51:03.828 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: adbc4e17a35d45b7a85e0c1e37fe6114 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 00:51:03.828 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: adbc4e17a35d45b7a85e0c1e37fe6114 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 00:51:03.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:51:03.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:51:03.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming 
message with unique_id: adbc4e17a35d45b7a85e0c1e37fe6114 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 00:51:03.829 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:51:03.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:51:03.829 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:51:03.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:51:03.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: adbc4e17a35d45b7a85e0c1e37fe6114 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 00:51:03.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:51:03.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:51:03.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:51:03.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:51:03.829 134140 DEBUG oslo_concurrency.lockutils [req-f98c4df1-f704-48ba-96d4-1ca768df2530 - - - - -] Lock "host_instance" acquired by 
"nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:51:03.829 134145 DEBUG oslo_concurrency.lockutils [req-f98c4df1-f704-48ba-96d4-1ca768df2530 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:51:03.830 134140 DEBUG nova.scheduler.host_manager [req-f98c4df1-f704-48ba-96d4-1ca768df2530 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 00:51:03.830 134145 DEBUG nova.scheduler.host_manager [req-f98c4df1-f704-48ba-96d4-1ca768df2530 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 00:51:03.830 134140 DEBUG oslo_concurrency.lockutils [req-f98c4df1-f704-48ba-96d4-1ca768df2530 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:51:03.830 134145 DEBUG oslo_concurrency.lockutils [req-f98c4df1-f704-48ba-96d4-1ca768df2530 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:51:03.830 134146 DEBUG oslo_concurrency.lockutils [req-f98c4df1-f704-48ba-96d4-1ca768df2530 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:51:03.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] 
AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:51:03.830 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:51:03.830 134138 DEBUG oslo_concurrency.lockutils [req-f98c4df1-f704-48ba-96d4-1ca768df2530 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:51:03.830 134146 DEBUG nova.scheduler.host_manager [req-f98c4df1-f704-48ba-96d4-1ca768df2530 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 00:51:03.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:51:03.830 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:51:03.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:51:03.830 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:51:03.830 134138 DEBUG nova.scheduler.host_manager [req-f98c4df1-f704-48ba-96d4-1ca768df2530 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 00:51:03.830 134146 DEBUG oslo_concurrency.lockutils [req-f98c4df1-f704-48ba-96d4-1ca768df2530 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:51:03.831 134138 DEBUG oslo_concurrency.lockutils [req-f98c4df1-f704-48ba-96d4-1ca768df2530 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:51:03.831 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:51:03.831 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:51:03.831 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:51:03.831 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:51:03.831 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:51:03.831 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:51:04.078 134146 DEBUG oslo_service.periodic_task [req-efbb2a37-ee21-446a-9318-0ce633c18197 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks 
/usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:51:04.082 134146 DEBUG oslo_concurrency.lockutils [req-dec8181f-410a-4df4-b1fc-a9fd34521580 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:51:04.083 134146 DEBUG oslo_concurrency.lockutils [req-dec8181f-410a-4df4-b1fc-a9fd34521580 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:51:04.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:51:04.832 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:51:04.832 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:51:04.832 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:51:04.832 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:51:04.832 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:51:04.833 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:51:04.833 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:51:04.833 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:51:04.834 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:51:04.834 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:51:04.834 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:51:06.834 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:51:06.834 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:51:06.835 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:51:06.835 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:51:06.835 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:51:06.835 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:51:06.835 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:51:06.836 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:51:06.836 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:51:06.836 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:51:06.837 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:51:06.837 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:51:10.839 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:51:10.839 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:51:10.839 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:51:10.839 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:51:10.839 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:51:10.840 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:51:10.840 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:51:10.840 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:51:10.841 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:51:10.841 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:51:10.841 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:51:10.842 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:51:17.020 134138 DEBUG oslo_service.periodic_task [req-8a3ffa3d-c134-405d-be1e-4d652a039316 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:51:17.025 134138 DEBUG oslo_concurrency.lockutils [req-56d6fade-29c8-42c1-bc73-f30be1239d81 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:51:17.025 134138 DEBUG 
oslo_concurrency.lockutils [req-56d6fade-29c8-42c1-bc73-f30be1239d81 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:51:18.842 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:51:18.842 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:51:18.842 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:51:18.843 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:51:18.844 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:51:18.844 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:51:18.845 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:51:18.846 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:51:18.846 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:51:18.848 134146 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:51:18.848 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:51:18.848 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:51:19.069 134140 DEBUG oslo_service.periodic_task [req-1cd6a50d-09f7-4be5-b2bb-113f6c2e7385 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:51:19.073 134140 DEBUG oslo_concurrency.lockutils [req-80534a01-2eef-40ee-8da2-2a943c84375d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:51:19.074 134140 DEBUG oslo_concurrency.lockutils [req-80534a01-2eef-40ee-8da2-2a943c84375d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:51:21.073 134145 DEBUG oslo_service.periodic_task [req-db2cb4ef-7a84-44c7-934f-f620fe64e47a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:51:21.077 134145 DEBUG oslo_concurrency.lockutils [req-9611e10d-6003-4e48-bac7-d41377ab6880 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:51:21.077 134145 DEBUG oslo_concurrency.lockutils [req-9611e10d-6003-4e48-bac7-d41377ab6880 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:51:34.845 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:51:34.845 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:51:34.845 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:51:34.846 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:51:34.846 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:51:34.846 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:51:34.848 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:51:34.848 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:51:34.848 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:51:34.849 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:51:34.850 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:51:34.850 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:51:35.069 134146 DEBUG oslo_service.periodic_task [req-dec8181f-410a-4df4-b1fc-a9fd34521580 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:51:35.073 134146 DEBUG oslo_concurrency.lockutils [req-d04fc794-7246-4bf0-a461-4205c7c569dd - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:51:35.073 134146 DEBUG oslo_concurrency.lockutils [req-d04fc794-7246-4bf0-a461-4205c7c569dd - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:51:47.032 134138 DEBUG oslo_service.periodic_task [req-56d6fade-29c8-42c1-bc73-f30be1239d81 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:51:47.036 134138 DEBUG oslo_concurrency.lockutils [req-37933aaf-64dc-4b22-ac7e-e8644de2a680 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:51:47.037 134138 DEBUG oslo_concurrency.lockutils [req-37933aaf-64dc-4b22-ac7e-e8644de2a680 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:51:49.080 134140 DEBUG oslo_service.periodic_task [req-80534a01-2eef-40ee-8da2-2a943c84375d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:51:49.084 134140 DEBUG oslo_concurrency.lockutils [req-651a4247-e8d4-4773-a1b7-216fc2139fc3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:51:49.084 134140 DEBUG oslo_concurrency.lockutils [req-651a4247-e8d4-4773-a1b7-216fc2139fc3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:51:51.083 134145 DEBUG oslo_service.periodic_task [req-9611e10d-6003-4e48-bac7-d41377ab6880 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:51:51.087 134145 DEBUG oslo_concurrency.lockutils [req-807b1a48-cf57-4200-b3d7-e886d1753060 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:51:51.087 134145 DEBUG oslo_concurrency.lockutils [req-807b1a48-cf57-4200-b3d7-e886d1753060 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:52:05.079 134146 DEBUG oslo_service.periodic_task [req-d04fc794-7246-4bf0-a461-4205c7c569dd - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:52:05.083 134146 DEBUG oslo_concurrency.lockutils [req-9fc4394f-4dcd-4066-8215-f013ba393877 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:52:05.083 134146 DEBUG oslo_concurrency.lockutils [req-9fc4394f-4dcd-4066-8215-f013ba393877 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:52:06.848 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:52:06.849 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:52:06.849 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:52:06.850 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:52:06.850 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:52:06.851 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:52:06.851 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:52:06.852 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:52:06.852 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:52:06.852 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:52:06.852 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:52:06.852 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:52:17.042 134138 DEBUG oslo_service.periodic_task [req-37933aaf-64dc-4b22-ac7e-e8644de2a680 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:52:17.046 134138 DEBUG oslo_concurrency.lockutils [req-624112fb-621f-4089-9bb0-6c672a6bd28a - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:52:17.047 134138 DEBUG oslo_concurrency.lockutils [req-624112fb-621f-4089-9bb0-6c672a6bd28a - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:52:19.089 134140 DEBUG oslo_service.periodic_task [req-651a4247-e8d4-4773-a1b7-216fc2139fc3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:52:19.095 134140 DEBUG oslo_concurrency.lockutils [req-d2dd49e3-997e-49d0-84e5-5e34d66f9ce2 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:52:19.095 134140 DEBUG oslo_concurrency.lockutils [req-d2dd49e3-997e-49d0-84e5-5e34d66f9ce2 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:52:21.095 134145 DEBUG oslo_service.periodic_task [req-807b1a48-cf57-4200-b3d7-e886d1753060 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:52:21.099 134145 DEBUG oslo_concurrency.lockutils [req-bfb9d34a-b6df-48ef-a96d-3321e142a107 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:52:21.099 134145 DEBUG oslo_concurrency.lockutils [req-bfb9d34a-b6df-48ef-a96d-3321e142a107 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:52:35.089 134146 DEBUG oslo_service.periodic_task [req-9fc4394f-4dcd-4066-8215-f013ba393877 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:52:35.094 134146 DEBUG oslo_concurrency.lockutils [req-2233570c-402b-4e58-980a-77dab6272471 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:52:35.094 134146 DEBUG oslo_concurrency.lockutils [req-2233570c-402b-4e58-980a-77dab6272471 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:52:48.019 134138 DEBUG oslo_service.periodic_task [req-624112fb-621f-4089-9bb0-6c672a6bd28a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:52:48.023 134138 DEBUG oslo_concurrency.lockutils [req-e845c988-ec41-4560-9c76-8d298c6ccbed - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:52:48.024 134138 DEBUG 
oslo_concurrency.lockutils [req-e845c988-ec41-4560-9c76-8d298c6ccbed - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:52:49.101 134140 DEBUG oslo_service.periodic_task [req-d2dd49e3-997e-49d0-84e5-5e34d66f9ce2 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:52:49.105 134140 DEBUG oslo_concurrency.lockutils [req-cc80e2eb-ebc0-459d-9493-cda1aebdb6d3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:52:49.105 134140 DEBUG oslo_concurrency.lockutils [req-cc80e2eb-ebc0-459d-9493-cda1aebdb6d3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:52:51.106 134145 DEBUG oslo_service.periodic_task [req-bfb9d34a-b6df-48ef-a96d-3321e142a107 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:52:51.110 134145 DEBUG oslo_concurrency.lockutils [req-03a86bb8-d836-433d-9bb9-7770deba2e5d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:52:51.110 134145 DEBUG oslo_concurrency.lockutils [req-03a86bb8-d836-433d-9bb9-7770deba2e5d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:53:05.099 134146 DEBUG oslo_service.periodic_task [req-2233570c-402b-4e58-980a-77dab6272471 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:53:05.104 134146 DEBUG oslo_concurrency.lockutils [req-dfcf557a-b19d-427d-b79a-25c65e497967 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:53:05.104 134146 DEBUG oslo_concurrency.lockutils [req-dfcf557a-b19d-427d-b79a-25c65e497967 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:53:06.828 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: b04e20c0b6bc40d28c1486ed2048969c __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 00:53:06.828 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: b04e20c0b6bc40d28c1486ed2048969c __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 00:53:06.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: b04e20c0b6bc40d28c1486ed2048969c __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 00:53:06.829 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:53:06.828 134146 DEBUG 
oslo_messaging._drivers.amqpdriver [-] received message with unique_id: b04e20c0b6bc40d28c1486ed2048969c __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 00:53:06.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:53:06.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:53:06.829 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: b04e20c0b6bc40d28c1486ed2048969c poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 00:53:06.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: b04e20c0b6bc40d28c1486ed2048969c poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 00:53:06.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: b04e20c0b6bc40d28c1486ed2048969c poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 00:53:06.829 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:53:06.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:53:06.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:53:06.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:53:06.829 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] 
AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:53:06.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:53:06.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:53:06.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: b04e20c0b6bc40d28c1486ed2048969c poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 00:53:06.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:53:06.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:53:06.830 134138 DEBUG oslo_concurrency.lockutils [req-a956600c-110f-4f8e-bf33-263efe225267 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:53:06.830 134140 DEBUG oslo_concurrency.lockutils [req-a956600c-110f-4f8e-bf33-263efe225267 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:53:06.830 134145 DEBUG oslo_concurrency.lockutils [req-a956600c-110f-4f8e-bf33-263efe225267 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:53:06.830 134138 
DEBUG nova.scheduler.host_manager [req-a956600c-110f-4f8e-bf33-263efe225267 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 00:53:06.830 134140 DEBUG nova.scheduler.host_manager [req-a956600c-110f-4f8e-bf33-263efe225267 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 00:53:06.830 134145 DEBUG nova.scheduler.host_manager [req-a956600c-110f-4f8e-bf33-263efe225267 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 00:53:06.830 134138 DEBUG oslo_concurrency.lockutils [req-a956600c-110f-4f8e-bf33-263efe225267 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:53:06.830 134145 DEBUG oslo_concurrency.lockutils [req-a956600c-110f-4f8e-bf33-263efe225267 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:53:06.830 134140 DEBUG oslo_concurrency.lockutils [req-a956600c-110f-4f8e-bf33-263efe225267 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:53:06.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:53:06.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] 
AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:53:06.831 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:53:06.830 134146 DEBUG oslo_concurrency.lockutils [req-a956600c-110f-4f8e-bf33-263efe225267 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:53:06.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:53:06.831 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:53:06.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:53:06.831 134146 DEBUG nova.scheduler.host_manager [req-a956600c-110f-4f8e-bf33-263efe225267 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 00:53:06.831 134146 DEBUG oslo_concurrency.lockutils [req-a956600c-110f-4f8e-bf33-263efe225267 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:53:06.832 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:53:06.832 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:53:06.832 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:53:06.832 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:53:06.832 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:53:06.832 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:53:07.832 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:53:07.832 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:53:07.832 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:53:07.832 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:53:07.832 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:53:07.833 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:53:07.833 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:53:07.833 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:53:07.833 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:53:07.833 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:53:07.834 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:53:07.834 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:53:09.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:53:09.834 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection 
timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:53:09.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:53:09.834 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:53:09.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:53:09.834 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:53:09.835 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:53:09.835 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:53:09.835 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:53:09.835 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:53:09.835 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:53:09.835 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:53:13.836 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection 
timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:53:13.836 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:53:13.836 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:53:13.836 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:53:13.836 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:53:13.837 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:53:13.836 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:53:13.837 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:53:13.837 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:53:13.837 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:53:13.837 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:53:13.837 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection 
consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:53:18.033 134138 DEBUG oslo_service.periodic_task [req-e845c988-ec41-4560-9c76-8d298c6ccbed - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:53:18.037 134138 DEBUG oslo_concurrency.lockutils [req-09a20c8c-9122-49ca-b5df-cf5e92db1a32 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:53:18.038 134138 DEBUG oslo_concurrency.lockutils [req-09a20c8c-9122-49ca-b5df-cf5e92db1a32 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:53:19.114 134140 DEBUG oslo_service.periodic_task [req-cc80e2eb-ebc0-459d-9493-cda1aebdb6d3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:53:19.119 134140 DEBUG oslo_concurrency.lockutils [req-ba96d7c7-3980-4f35-ad9d-b76bcc071a5d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:53:19.119 134140 DEBUG oslo_concurrency.lockutils [req-ba96d7c7-3980-4f35-ad9d-b76bcc071a5d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:53:21.839 134145 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:53:21.839 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:53:21.839 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:53:21.839 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:53:21.840 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:53:21.840 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:53:21.840 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:53:21.840 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:53:21.840 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:53:21.846 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:53:21.846 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:53:21.846 134146 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:53:22.051 134145 DEBUG oslo_service.periodic_task [req-03a86bb8-d836-433d-9bb9-7770deba2e5d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:53:22.056 134145 DEBUG oslo_concurrency.lockutils [req-b7bc4a94-8a84-436e-918c-694f747e1014 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:53:22.056 134145 DEBUG oslo_concurrency.lockutils [req-b7bc4a94-8a84-436e-918c-694f747e1014 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:53:35.111 134146 DEBUG oslo_service.periodic_task [req-dfcf557a-b19d-427d-b79a-25c65e497967 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:53:35.115 134146 DEBUG oslo_concurrency.lockutils [req-2bcb50dc-5357-4e95-aa4b-32245916ecfa - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:53:35.115 134146 DEBUG oslo_concurrency.lockutils [req-2bcb50dc-5357-4e95-aa4b-32245916ecfa - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:53:37.841 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:53:37.841 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:53:37.841 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:53:37.842 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:53:37.842 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:53:37.842 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:53:37.842 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:53:37.843 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:53:37.843 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:53:37.846 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:53:37.847 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:53:37.847 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:53:48.043 134138 DEBUG oslo_service.periodic_task [req-09a20c8c-9122-49ca-b5df-cf5e92db1a32 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:53:48.047 134138 DEBUG oslo_concurrency.lockutils [req-4ff3d1ff-8d2d-41f4-8049-c4512cc0581c - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:53:48.047 134138 DEBUG oslo_concurrency.lockutils [req-4ff3d1ff-8d2d-41f4-8049-c4512cc0581c - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:53:50.051 134140 DEBUG oslo_service.periodic_task [req-ba96d7c7-3980-4f35-ad9d-b76bcc071a5d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:53:50.056 134140 DEBUG oslo_concurrency.lockutils [req-38263e5f-3bed-4384-af1a-39e01fe9ba22 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:53:50.056 134140 DEBUG oslo_concurrency.lockutils [req-38263e5f-3bed-4384-af1a-39e01fe9ba22 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:53:52.062 134145 DEBUG oslo_service.periodic_task [req-b7bc4a94-8a84-436e-918c-694f747e1014 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:53:52.066 134145 DEBUG oslo_concurrency.lockutils [req-26f637e2-2c3c-4043-8a78-602bdcd680ef - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:53:52.067 134145 DEBUG oslo_concurrency.lockutils [req-26f637e2-2c3c-4043-8a78-602bdcd680ef - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:54:05.122 134146 DEBUG oslo_service.periodic_task [req-2bcb50dc-5357-4e95-aa4b-32245916ecfa - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:54:05.126 134146 DEBUG oslo_concurrency.lockutils [req-dd788645-2e4a-478c-ae07-987bfb2e73d5 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:54:05.126 134146 DEBUG oslo_concurrency.lockutils [req-dd788645-2e4a-478c-ae07-987bfb2e73d5 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:54:09.848 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:54:09.848 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:54:09.848 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:54:09.848 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:54:09.848 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:54:09.848 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:54:09.849 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:54:09.848 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:54:09.849 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:54:09.849 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:54:09.849 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:54:09.850 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:54:19.020 134138 DEBUG oslo_service.periodic_task [req-4ff3d1ff-8d2d-41f4-8049-c4512cc0581c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:54:19.024 134138 DEBUG oslo_concurrency.lockutils [req-7201b562-62ca-4d9d-baa8-443f629b9394 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:54:19.024 134138 DEBUG oslo_concurrency.lockutils [req-7201b562-62ca-4d9d-baa8-443f629b9394 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:54:21.052 134140 DEBUG oslo_service.periodic_task [req-38263e5f-3bed-4384-af1a-39e01fe9ba22 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:54:21.055 134140 DEBUG oslo_concurrency.lockutils [req-f721befe-0596-4e72-a5b1-c7cd2e40c493 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:54:21.056 134140 DEBUG oslo_concurrency.lockutils [req-f721befe-0596-4e72-a5b1-c7cd2e40c493 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:54:22.075 134145 DEBUG oslo_service.periodic_task [req-26f637e2-2c3c-4043-8a78-602bdcd680ef - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:54:22.080 134145 DEBUG oslo_concurrency.lockutils [req-e8724b00-f55a-4374-bb28-956c1351d138 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:54:22.080 134145 DEBUG oslo_concurrency.lockutils [req-e8724b00-f55a-4374-bb28-956c1351d138 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:54:36.069 134146 DEBUG oslo_service.periodic_task [req-dd788645-2e4a-478c-ae07-987bfb2e73d5 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:54:36.073 134146 DEBUG oslo_concurrency.lockutils [req-4113f4ae-fb97-432a-b397-a879742c4262 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:54:36.074 134146 DEBUG oslo_concurrency.lockutils [req-4113f4ae-fb97-432a-b397-a879742c4262 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:54:49.029 134138 DEBUG oslo_service.periodic_task [req-7201b562-62ca-4d9d-baa8-443f629b9394 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:54:49.033 134138 DEBUG oslo_concurrency.lockutils [req-85c847b7-381d-4d29-9b7b-616967515929 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:54:49.033 134138 DEBUG oslo_concurrency.lockutils [req-85c847b7-381d-4d29-9b7b-616967515929 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:54:51.062 134140 DEBUG oslo_service.periodic_task [req-f721befe-0596-4e72-a5b1-c7cd2e40c493 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:54:51.067 134140 DEBUG oslo_concurrency.lockutils [req-9100b54f-8488-4aca-a3ae-7ace681639e3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:54:51.067 134140 DEBUG oslo_concurrency.lockutils [req-9100b54f-8488-4aca-a3ae-7ace681639e3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:54:52.089 134145 DEBUG oslo_service.periodic_task [req-e8724b00-f55a-4374-bb28-956c1351d138 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:54:52.093 134145 DEBUG oslo_concurrency.lockutils [req-af1c1503-df7e-4283-a68a-4c8f3a8e15ae - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:54:52.094 134145 DEBUG oslo_concurrency.lockutils [req-af1c1503-df7e-4283-a68a-4c8f3a8e15ae - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:55:06.081 134146 DEBUG oslo_service.periodic_task [req-4113f4ae-fb97-432a-b397-a879742c4262 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:55:06.086 134146 DEBUG oslo_concurrency.lockutils [req-75ebd4ae-a6e7-4da5-9e2f-147af1164976 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:55:06.086 134146 DEBUG oslo_concurrency.lockutils [req-75ebd4ae-a6e7-4da5-9e2f-147af1164976 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:55:08.827 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 276b6cc092974446a74aa49357de266d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 00:55:08.827 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 276b6cc092974446a74aa49357de266d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 00:55:08.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 276b6cc092974446a74aa49357de266d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 00:55:08.827 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:55:08.827 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 276b6cc092974446a74aa49357de266d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 00:55:08.827 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:55:08.827 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 276b6cc092974446a74aa49357de266d poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 00:55:08.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:55:08.827 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 276b6cc092974446a74aa49357de266d poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 00:55:08.827 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:55:08.827 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:55:08.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 276b6cc092974446a74aa49357de266d poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 00:55:08.827 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:55:08.827 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 276b6cc092974446a74aa49357de266d poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 00:55:08.827 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:55:08.828 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:55:08.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:55:08.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:55:08.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:55:08.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:55:08.828 134145 DEBUG oslo_concurrency.lockutils [req-6f3d7a75-f062-4fec-b2e9-fd5633766b47 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:55:08.828 134140 DEBUG oslo_concurrency.lockutils [req-6f3d7a75-f062-4fec-b2e9-fd5633766b47 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:55:08.828 134145 DEBUG nova.scheduler.host_manager [req-6f3d7a75-f062-4fec-b2e9-fd5633766b47 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 00:55:08.828 134146 DEBUG oslo_concurrency.lockutils [req-6f3d7a75-f062-4fec-b2e9-fd5633766b47 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:55:08.828 134138 DEBUG oslo_concurrency.lockutils [req-6f3d7a75-f062-4fec-b2e9-fd5633766b47 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:55:08.828 134140 DEBUG nova.scheduler.host_manager [req-6f3d7a75-f062-4fec-b2e9-fd5633766b47 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 00:55:08.828 134145 DEBUG oslo_concurrency.lockutils [req-6f3d7a75-f062-4fec-b2e9-fd5633766b47 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:55:08.829 134146 DEBUG nova.scheduler.host_manager [req-6f3d7a75-f062-4fec-b2e9-fd5633766b47 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 00:55:08.829 134138 DEBUG nova.scheduler.host_manager [req-6f3d7a75-f062-4fec-b2e9-fd5633766b47 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 00:55:08.829 134140 DEBUG oslo_concurrency.lockutils [req-6f3d7a75-f062-4fec-b2e9-fd5633766b47 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:55:08.829 134146 DEBUG oslo_concurrency.lockutils [req-6f3d7a75-f062-4fec-b2e9-fd5633766b47 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:55:08.829 134138 DEBUG oslo_concurrency.lockutils [req-6f3d7a75-f062-4fec-b2e9-fd5633766b47 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:55:08.829 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:55:08.829 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:55:08.829 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:55:08.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:55:08.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:55:08.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:55:08.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:55:08.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:55:08.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:55:08.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:55:08.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:55:08.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:55:09.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:55:09.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:55:09.831 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:55:09.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:55:09.831 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:55:09.831 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:55:09.832 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:55:09.832 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:55:09.832 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:55:09.832 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:55:09.832 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:55:09.832 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:55:11.832 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:55:11.832 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:55:11.832 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:55:11.832 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:55:11.832 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:55:11.832 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:55:11.834 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:55:11.834 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:55:11.834 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:55:11.835 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:55:11.835 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:55:11.835 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:55:15.835 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:55:15.836 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:55:15.836 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:55:15.835 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:55:15.836 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:55:15.836 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:55:15.837 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:55:15.838 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:55:15.838 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:55:15.838 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:55:15.838 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:55:15.839 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:55:19.041 134138 DEBUG oslo_service.periodic_task [req-85c847b7-381d-4d29-9b7b-616967515929 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:55:19.045 134138 DEBUG oslo_concurrency.lockutils [req-c138066f-a8c8-4c96-a3e1-6cd387dfe870 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:55:19.045 134138 DEBUG oslo_concurrency.lockutils [req-c138066f-a8c8-4c96-a3e1-6cd387dfe870 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:55:21.076 134140 DEBUG oslo_service.periodic_task [req-9100b54f-8488-4aca-a3ae-7ace681639e3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:55:21.080 134140 DEBUG oslo_concurrency.lockutils [req-d5e6b591-1009-43a9-8894-535016ab330f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:55:21.080 134140 DEBUG oslo_concurrency.lockutils [req-d5e6b591-1009-43a9-8894-535016ab330f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:55:22.102 134145 DEBUG oslo_service.periodic_task [req-af1c1503-df7e-4283-a68a-4c8f3a8e15ae - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:55:22.106 134145 DEBUG oslo_concurrency.lockutils [req-8e5f951c-71f0-4b6a-ada4-275106b9d2fe - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:55:22.106 134145 DEBUG oslo_concurrency.lockutils [req-8e5f951c-71f0-4b6a-ada4-275106b9d2fe - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:55:23.837 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:55:23.838 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:55:23.838 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:55:23.838 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:55:23.838 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:55:23.838 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:55:23.839 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:55:23.839 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:55:23.840 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:55:23.840 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:55:23.840 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:55:23.840 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:55:36.093 134146 DEBUG oslo_service.periodic_task [req-75ebd4ae-a6e7-4da5-9e2f-147af1164976 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:55:36.098 134146 DEBUG oslo_concurrency.lockutils [req-14a9193f-5189-4d9f-9340-4d0826237017 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:55:36.098 134146 DEBUG oslo_concurrency.lockutils [req-14a9193f-5189-4d9f-9340-4d0826237017 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:55:39.839 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:55:39.840 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:55:39.840 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:55:39.840 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:55:39.841 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:55:39.841 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:55:39.841 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:55:39.841 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:55:39.842 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:55:39.842 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:55:39.842 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:55:39.842 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:55:49.050 134138 DEBUG oslo_service.periodic_task [req-c138066f-a8c8-4c96-a3e1-6cd387dfe870 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:55:49.055 134138 DEBUG oslo_concurrency.lockutils [req-dbc63223-8c93-4383-a98a-1c60576e4374 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:55:49.055 134138 DEBUG oslo_concurrency.lockutils [req-dbc63223-8c93-4383-a98a-1c60576e4374 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:55:52.053 134140 DEBUG oslo_service.periodic_task [req-d5e6b591-1009-43a9-8894-535016ab330f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:55:52.057 134140 DEBUG oslo_concurrency.lockutils [req-944a8447-5849-491d-ba5a-b279e687be60 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:55:52.057 134140 DEBUG oslo_concurrency.lockutils [req-944a8447-5849-491d-ba5a-b279e687be60 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:55:52.114 134145 DEBUG oslo_service.periodic_task [req-8e5f951c-71f0-4b6a-ada4-275106b9d2fe - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:55:52.118 134145 DEBUG oslo_concurrency.lockutils [req-3ad97406-7d19-4ee7-b08a-8fddbc1cc0a0 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:55:52.118 134145 DEBUG oslo_concurrency.lockutils [req-3ad97406-7d19-4ee7-b08a-8fddbc1cc0a0 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:56:07.069 134146 DEBUG oslo_service.periodic_task [req-14a9193f-5189-4d9f-9340-4d0826237017 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:56:07.073 134146 DEBUG
oslo_concurrency.lockutils [req-c635fa20-c10a-4234-8a97-e4a13d6b6966 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:56:07.074 134146 DEBUG oslo_concurrency.lockutils [req-c635fa20-c10a-4234-8a97-e4a13d6b6966 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:56:11.846 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:56:11.847 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:56:11.847 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:56:11.847 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:56:11.848 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:56:11.848 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:56:11.848 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:56:11.848 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener 
is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:56:11.848 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:56:11.850 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:56:11.851 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:56:11.851 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:56:19.061 134138 DEBUG oslo_service.periodic_task [req-dbc63223-8c93-4383-a98a-1c60576e4374 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:56:19.066 134138 DEBUG oslo_concurrency.lockutils [req-479e92c5-8a87-41a3-9b4a-4ddd854da257 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:56:19.066 134138 DEBUG oslo_concurrency.lockutils [req-479e92c5-8a87-41a3-9b4a-4ddd854da257 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:56:22.063 134140 DEBUG oslo_service.periodic_task [req-944a8447-5849-491d-ba5a-b279e687be60 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks 
/usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:56:22.067 134140 DEBUG oslo_concurrency.lockutils [req-2f3ff849-701d-4115-93bd-8d287ccb0e93 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:56:22.067 134140 DEBUG oslo_concurrency.lockutils [req-2f3ff849-701d-4115-93bd-8d287ccb0e93 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:56:22.128 134145 DEBUG oslo_service.periodic_task [req-3ad97406-7d19-4ee7-b08a-8fddbc1cc0a0 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:56:22.133 134145 DEBUG oslo_concurrency.lockutils [req-acf1748f-e414-4c1c-8b4a-b7daafbf7a14 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:56:22.133 134145 DEBUG oslo_concurrency.lockutils [req-acf1748f-e414-4c1c-8b4a-b7daafbf7a14 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:56:37.078 134146 DEBUG oslo_service.periodic_task [req-c635fa20-c10a-4234-8a97-e4a13d6b6966 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:56:37.082 134146 DEBUG 
oslo_concurrency.lockutils [req-fc132f0e-c9fb-452d-ad16-3a17bdd4ba14 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:56:37.082 134146 DEBUG oslo_concurrency.lockutils [req-fc132f0e-c9fb-452d-ad16-3a17bdd4ba14 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:56:49.071 134138 DEBUG oslo_service.periodic_task [req-479e92c5-8a87-41a3-9b4a-4ddd854da257 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:56:49.075 134138 DEBUG oslo_concurrency.lockutils [req-827c93d1-5f05-42fa-bd83-37590c613e14 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:56:49.076 134138 DEBUG oslo_concurrency.lockutils [req-827c93d1-5f05-42fa-bd83-37590c613e14 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:56:52.076 134140 DEBUG oslo_service.periodic_task [req-2f3ff849-701d-4115-93bd-8d287ccb0e93 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:56:52.080 134140 DEBUG oslo_concurrency.lockutils [req-369fb8d6-f3f8-4884-8b80-e1cf364c4fbb - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" 
acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:56:52.080 134140 DEBUG oslo_concurrency.lockutils [req-369fb8d6-f3f8-4884-8b80-e1cf364c4fbb - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:56:53.052 134145 DEBUG oslo_service.periodic_task [req-acf1748f-e414-4c1c-8b4a-b7daafbf7a14 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:56:53.056 134145 DEBUG oslo_concurrency.lockutils [req-3eb1123e-924e-432c-a26e-2477577048f0 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:56:53.057 134145 DEBUG oslo_concurrency.lockutils [req-3eb1123e-924e-432c-a26e-2477577048f0 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:57:08.069 134146 DEBUG oslo_service.periodic_task [req-fc132f0e-c9fb-452d-ad16-3a17bdd4ba14 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:57:08.073 134146 DEBUG oslo_concurrency.lockutils [req-59718c99-d100-48b7-b2b3-7662a9935bb0 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:57:08.074 134146 DEBUG oslo_concurrency.lockutils [req-59718c99-d100-48b7-b2b3-7662a9935bb0 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:57:10.838 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 149da54c0c63408f99af93c04bdfd7d2 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 00:57:10.838 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 149da54c0c63408f99af93c04bdfd7d2 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 00:57:10.838 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 149da54c0c63408f99af93c04bdfd7d2 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 00:57:10.838 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:57:10.838 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:57:10.838 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:57:10.838 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 149da54c0c63408f99af93c04bdfd7d2 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 00:57:10.838 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 149da54c0c63408f99af93c04bdfd7d2 poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 00:57:10.838 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 149da54c0c63408f99af93c04bdfd7d2 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 00:57:10.838 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 149da54c0c63408f99af93c04bdfd7d2 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 00:57:10.839 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:57:10.839 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:57:10.839 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:57:10.839 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 149da54c0c63408f99af93c04bdfd7d2 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 00:57:10.839 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:57:10.839 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:57:10.839 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:57:10.839 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:57:10.839 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:57:10.839 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:57:10.839 134138 DEBUG oslo_concurrency.lockutils [req-31d8ae40-a106-44ad-972c-2a0f61214e03 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:57:10.840 134138 DEBUG nova.scheduler.host_manager [req-31d8ae40-a106-44ad-972c-2a0f61214e03 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 00:57:10.839 134140 DEBUG oslo_concurrency.lockutils [req-31d8ae40-a106-44ad-972c-2a0f61214e03 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:57:10.839 134146 DEBUG oslo_concurrency.lockutils [req-31d8ae40-a106-44ad-972c-2a0f61214e03 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:57:10.840 134138 DEBUG oslo_concurrency.lockutils [req-31d8ae40-a106-44ad-972c-2a0f61214e03 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:57:10.840 134145 DEBUG oslo_concurrency.lockutils 
[req-31d8ae40-a106-44ad-972c-2a0f61214e03 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:57:10.840 134145 DEBUG nova.scheduler.host_manager [req-31d8ae40-a106-44ad-972c-2a0f61214e03 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 00:57:10.840 134146 DEBUG nova.scheduler.host_manager [req-31d8ae40-a106-44ad-972c-2a0f61214e03 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 00:57:10.840 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:57:10.840 134140 DEBUG nova.scheduler.host_manager [req-31d8ae40-a106-44ad-972c-2a0f61214e03 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 00:57:10.840 134145 DEBUG oslo_concurrency.lockutils [req-31d8ae40-a106-44ad-972c-2a0f61214e03 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:57:10.840 134146 DEBUG oslo_concurrency.lockutils [req-31d8ae40-a106-44ad-972c-2a0f61214e03 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:57:10.841 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:57:10.840 134140 DEBUG oslo_concurrency.lockutils [req-31d8ae40-a106-44ad-972c-2a0f61214e03 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:57:10.841 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:57:10.841 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:57:10.841 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:57:10.841 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:57:10.841 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume 
poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:57:10.841 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:57:10.841 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:57:10.841 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:57:10.841 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:57:10.841 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:57:11.842 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:57:11.842 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:57:11.842 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:57:11.842 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:57:11.842 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:57:11.842 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running 
poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:57:11.842 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:57:11.842 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:57:11.843 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:57:11.843 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:57:11.843 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:57:11.843 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:57:13.843 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:57:13.843 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:57:13.843 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:57:13.844 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:57:13.844 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener 
connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:57:13.844 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:57:13.844 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:57:13.844 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:57:13.844 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:57:13.844 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:57:13.844 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:57:13.845 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:57:17.846 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:57:17.846 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:57:17.847 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:57:17.847 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener 
connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:57:17.847 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:57:17.847 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:57:17.847 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:57:17.847 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:57:17.847 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:57:17.847 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:57:17.847 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:57:17.847 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:57:19.080 134138 DEBUG oslo_service.periodic_task [req-827c93d1-5f05-42fa-bd83-37590c613e14 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:57:19.085 134138 DEBUG oslo_concurrency.lockutils [req-529df12d-3c12-428c-bfea-e9c22a90a617 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:57:19.085 134138 DEBUG oslo_concurrency.lockutils [req-529df12d-3c12-428c-bfea-e9c22a90a617 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:57:22.088 134140 DEBUG oslo_service.periodic_task [req-369fb8d6-f3f8-4884-8b80-e1cf364c4fbb - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:57:22.093 134140 DEBUG oslo_concurrency.lockutils [req-181189a4-bd9c-4aa4-83b6-227556283bd9 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:57:22.093 134140 DEBUG oslo_concurrency.lockutils [req-181189a4-bd9c-4aa4-83b6-227556283bd9 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:57:24.052 134145 DEBUG oslo_service.periodic_task [req-3eb1123e-924e-432c-a26e-2477577048f0 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:57:24.056 134145 DEBUG oslo_concurrency.lockutils [req-ecc0ab1d-44e0-412c-a082-3d80df76c04b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:57:24.056 134145 DEBUG oslo_concurrency.lockutils [req-ecc0ab1d-44e0-412c-a082-3d80df76c04b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:57:25.850 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:57:25.850 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:57:25.850 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:57:25.850 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:57:25.850 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:57:25.850 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:57:25.850 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:57:25.850 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:57:25.850 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:57:25.850 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:57:25.850 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:57:25.850 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:57:38.083 134146 DEBUG oslo_service.periodic_task [req-59718c99-d100-48b7-b2b3-7662a9935bb0 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:57:38.087 134146 DEBUG oslo_concurrency.lockutils [req-c852094f-e4a9-4afa-9d84-5cec804d271a - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:57:38.087 134146 DEBUG oslo_concurrency.lockutils [req-c852094f-e4a9-4afa-9d84-5cec804d271a - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:57:41.852 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:57:41.852 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:57:41.852 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:57:41.853 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:57:41.853 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:57:41.853 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:57:41.853 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:57:41.853 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:57:41.853 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:57:41.853 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:57:41.853 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:57:41.853 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:57:50.020 134138 DEBUG oslo_service.periodic_task [req-529df12d-3c12-428c-bfea-e9c22a90a617 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:57:50.025 134138 DEBUG oslo_concurrency.lockutils [req-1400c384-96cb-4fb8-856f-c38e80c577a7 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:57:50.025 134138 DEBUG oslo_concurrency.lockutils [req-1400c384-96cb-4fb8-856f-c38e80c577a7 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:57:52.101 134140 DEBUG oslo_service.periodic_task [req-181189a4-bd9c-4aa4-83b6-227556283bd9 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:57:52.106 134140 DEBUG oslo_concurrency.lockutils [req-17487fc6-3494-4303-873d-b9186d1672fb - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:57:52.106 134140 DEBUG oslo_concurrency.lockutils [req-17487fc6-3494-4303-873d-b9186d1672fb - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:57:54.061 134145 DEBUG oslo_service.periodic_task [req-ecc0ab1d-44e0-412c-a082-3d80df76c04b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:57:54.066 134145 DEBUG oslo_concurrency.lockutils [req-4731e1dd-3fd1-43f7-ac00-1f0c639a9540 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:57:54.066 134145 DEBUG oslo_concurrency.lockutils [req-4731e1dd-3fd1-43f7-ac00-1f0c639a9540 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:58:09.070 134146 DEBUG oslo_service.periodic_task [req-c852094f-e4a9-4afa-9d84-5cec804d271a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:58:09.074 134146 DEBUG oslo_concurrency.lockutils [req-1ef46eae-615d-44de-879b-03e625b7954d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:58:09.075 134146 DEBUG oslo_concurrency.lockutils [req-1ef46eae-615d-44de-879b-03e625b7954d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:58:13.853 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:58:13.853 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:58:13.854 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:58:13.854 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:58:13.854 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:58:13.854 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:58:13.854 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:58:13.854 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:58:13.854 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:58:13.854 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:58:13.854 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:58:13.855 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:58:20.032 134138 DEBUG oslo_service.periodic_task [req-1400c384-96cb-4fb8-856f-c38e80c577a7 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:58:20.037 134138 DEBUG oslo_concurrency.lockutils [req-c118dbe5-d520-4c47-8c7c-ee432883fdf9 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:58:20.038 134138 DEBUG oslo_concurrency.lockutils [req-c118dbe5-d520-4c47-8c7c-ee432883fdf9 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:58:22.114 134140 DEBUG oslo_service.periodic_task [req-17487fc6-3494-4303-873d-b9186d1672fb - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:58:22.118 134140 DEBUG oslo_concurrency.lockutils [req-58bbc422-fa83-4da3-b81d-cbbab68655af - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:58:22.119 134140 DEBUG oslo_concurrency.lockutils [req-58bbc422-fa83-4da3-b81d-cbbab68655af - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:58:25.052 134145 DEBUG oslo_service.periodic_task [req-4731e1dd-3fd1-43f7-ac00-1f0c639a9540 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:58:25.057 134145 DEBUG oslo_concurrency.lockutils [req-cbbafd49-ba72-46ac-b5ea-1da85afb1372 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:58:25.057 134145 DEBUG oslo_concurrency.lockutils [req-cbbafd49-ba72-46ac-b5ea-1da85afb1372 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:58:39.084 134146 DEBUG oslo_service.periodic_task [req-1ef46eae-615d-44de-879b-03e625b7954d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:58:39.089 134146 DEBUG oslo_concurrency.lockutils [req-fd4e491c-c424-4137-8438-c1fff735708c - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:58:39.089 134146 DEBUG oslo_concurrency.lockutils [req-fd4e491c-c424-4137-8438-c1fff735708c - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:58:51.019 134138 DEBUG oslo_service.periodic_task [req-c118dbe5-d520-4c47-8c7c-ee432883fdf9 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:58:51.024 134138 DEBUG oslo_concurrency.lockutils [req-74a5570c-a9ab-4bfd-abb8-945354ace427 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:58:51.024 134138 DEBUG oslo_concurrency.lockutils [req-74a5570c-a9ab-4bfd-abb8-945354ace427 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:58:52.124 134140 DEBUG oslo_service.periodic_task [req-58bbc422-fa83-4da3-b81d-cbbab68655af - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:58:52.129 134140 DEBUG oslo_concurrency.lockutils [req-cd960f71-9e84-4596-b4e1-bf0721e75196 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:58:52.129 134140 DEBUG oslo_concurrency.lockutils [req-cd960f71-9e84-4596-b4e1-bf0721e75196 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:58:55.062 134145 DEBUG oslo_service.periodic_task [req-cbbafd49-ba72-46ac-b5ea-1da85afb1372 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:58:55.066 134145 DEBUG oslo_concurrency.lockutils [req-f4d1e2c9-c336-444c-bac7-07f58b2ce20a - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:58:55.067 134145 DEBUG oslo_concurrency.lockutils [req-f4d1e2c9-c336-444c-bac7-07f58b2ce20a - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:59:09.097 134146 DEBUG oslo_service.periodic_task [req-fd4e491c-c424-4137-8438-c1fff735708c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:59:09.101 134146 DEBUG oslo_concurrency.lockutils [req-cef8eb74-1c91-45ba-8a76-d9e92b391784 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:59:09.101 134146 DEBUG oslo_concurrency.lockutils [req-cef8eb74-1c91-45ba-8a76-d9e92b391784 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:59:14.825 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 185847f0c3f04c38b94e387836ad86b1 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 00:59:14.826 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 185847f0c3f04c38b94e387836ad86b1 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 00:59:14.826 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 185847f0c3f04c38b94e387836ad86b1 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 00:59:14.826 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 185847f0c3f04c38b94e387836ad86b1 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 00:59:14.826 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:59:14.826 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:59:14.826 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:59:14.826 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 185847f0c3f04c38b94e387836ad86b1 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 00:59:14.826 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:59:14.826 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 185847f0c3f04c38b94e387836ad86b1 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 00:59:14.826 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 185847f0c3f04c38b94e387836ad86b1 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 00:59:14.826 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 185847f0c3f04c38b94e387836ad86b1 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 00:59:14.826 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:59:14.826 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:59:14.826 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:59:14.826 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:59:14.826 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:59:14.826 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:59:14.826 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:59:14.826 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:59:14.827 134145 DEBUG oslo_concurrency.lockutils [req-d3110aaa-d191-4595-b7f6-15bb2ab689f6 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:59:14.827 134146 DEBUG oslo_concurrency.lockutils [req-d3110aaa-d191-4595-b7f6-15bb2ab689f6 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:59:14.827 134138 DEBUG oslo_concurrency.lockutils [req-d3110aaa-d191-4595-b7f6-15bb2ab689f6 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:59:14.827 134140 DEBUG oslo_concurrency.lockutils [req-d3110aaa-d191-4595-b7f6-15bb2ab689f6 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:59:14.827 134145 DEBUG nova.scheduler.host_manager [req-d3110aaa-d191-4595-b7f6-15bb2ab689f6 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 00:59:14.827 134146 DEBUG nova.scheduler.host_manager [req-d3110aaa-d191-4595-b7f6-15bb2ab689f6 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 00:59:14.827 134138 DEBUG nova.scheduler.host_manager [req-d3110aaa-d191-4595-b7f6-15bb2ab689f6 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 00:59:14.827 134140 DEBUG nova.scheduler.host_manager [req-d3110aaa-d191-4595-b7f6-15bb2ab689f6 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 00:59:14.827 134145 DEBUG oslo_concurrency.lockutils [req-d3110aaa-d191-4595-b7f6-15bb2ab689f6 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:59:14.827 134146 DEBUG oslo_concurrency.lockutils [req-d3110aaa-d191-4595-b7f6-15bb2ab689f6 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:59:14.828 134140 DEBUG oslo_concurrency.lockutils [req-d3110aaa-d191-4595-b7f6-15bb2ab689f6 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:59:14.828 134138 DEBUG oslo_concurrency.lockutils [req-d3110aaa-d191-4595-b7f6-15bb2ab689f6 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:59:14.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:59:14.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:59:14.828 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:59:14.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:59:14.828 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:59:14.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:59:14.828 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:59:14.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:59:14.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:59:14.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:59:14.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:59:14.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:59:15.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:59:15.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:59:15.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:59:15.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:59:15.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:59:15.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:59:15.829 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:59:15.830 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:59:15.830 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:59:15.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:59:15.831 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:59:15.831 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:59:17.831 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:59:17.831 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:59:17.832 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:59:17.832 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:59:17.832 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:59:17.832 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:59:17.832 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:59:17.833 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:59:17.833 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:59:17.833 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:59:17.833 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:59:17.833 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:59:21.032 134138 DEBUG oslo_service.periodic_task [req-74a5570c-a9ab-4bfd-abb8-945354ace427 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:59:21.036 134138 DEBUG oslo_concurrency.lockutils [req-e37c8c21-8fdd-46a8-8c5a-c411e5780071 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:59:21.036 134138 DEBUG oslo_concurrency.lockutils [req-e37c8c21-8fdd-46a8-8c5a-c411e5780071 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:59:21.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:59:21.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:59:21.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:59:21.835 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:59:21.836 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:59:21.836 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:59:21.836 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:59:21.836 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:59:21.836 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:59:21.837 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 00:59:21.837 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 00:59:21.838 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 00:59:22.135 134140 DEBUG oslo_service.periodic_task [req-cd960f71-9e84-4596-b4e1-bf0721e75196 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 00:59:22.139 134140 DEBUG oslo_concurrency.lockutils [req-7545f44d-b919-486b-a1a8-555f46112753 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 00:59:22.139 134140 DEBUG oslo_concurrency.lockutils [req-7545f44d-b919-486b-a1a8-555f46112753 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 00:59:26.051 134145 DEBUG oslo_service.periodic_task [req-f4d1e2c9-c336-444c-bac7-07f58b2ce20a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:59:26.056 134145 DEBUG oslo_concurrency.lockutils [req-e87ae39f-99a8-4702-bcb7-380412dfcb92 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:59:26.056 134145 DEBUG oslo_concurrency.lockutils [req-e87ae39f-99a8-4702-bcb7-380412dfcb92 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:59:29.836 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:59:29.836 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:59:29.836 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:59:29.842 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:59:29.842 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:59:29.842 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:59:29.842 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:59:29.842 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:59:29.842 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:59:29.842 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:59:29.842 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:59:29.842 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:59:39.110 134146 DEBUG oslo_service.periodic_task [req-cef8eb74-1c91-45ba-8a76-d9e92b391784 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:59:39.115 134146 DEBUG oslo_concurrency.lockutils [req-26ff2c95-64ed-48b7-b96a-0dfc7ff9ad09 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:59:39.115 134146 DEBUG oslo_concurrency.lockutils [req-26ff2c95-64ed-48b7-b96a-0dfc7ff9ad09 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:59:45.838 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:59:45.838 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:59:45.838 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:59:45.844 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:59:45.844 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:59:45.844 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:59:45.844 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 00:59:45.844 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:59:45.844 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:59:45.844 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 
00:59:45.845 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 00:59:45.845 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 00:59:51.042 134138 DEBUG oslo_service.periodic_task [req-e37c8c21-8fdd-46a8-8c5a-c411e5780071 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:59:51.046 134138 DEBUG oslo_concurrency.lockutils [req-30f4ec0c-52b5-419f-9d87-8bb1563c541c - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:59:51.046 134138 DEBUG oslo_concurrency.lockutils [req-30f4ec0c-52b5-419f-9d87-8bb1563c541c - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:59:52.149 134140 DEBUG oslo_service.periodic_task [req-7545f44d-b919-486b-a1a8-555f46112753 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:59:52.153 134140 DEBUG oslo_concurrency.lockutils [req-d8030929-4f7d-4a8d-a4ee-3df7c1ece29b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:59:52.153 134140 DEBUG oslo_concurrency.lockutils [req-d8030929-4f7d-4a8d-a4ee-3df7c1ece29b - - - 
- -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 00:59:56.062 134145 DEBUG oslo_service.periodic_task [req-e87ae39f-99a8-4702-bcb7-380412dfcb92 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 00:59:56.066 134145 DEBUG oslo_concurrency.lockutils [req-489207f6-8d94-478f-9609-9a76ee85b0f1 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 00:59:56.066 134145 DEBUG oslo_concurrency.lockutils [req-489207f6-8d94-478f-9609-9a76ee85b0f1 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:00:09.122 134146 DEBUG oslo_service.periodic_task [req-26ff2c95-64ed-48b7-b96a-0dfc7ff9ad09 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:00:09.126 134146 DEBUG oslo_concurrency.lockutils [req-ab466146-2da6-4f23-8ca2-df7707ce1708 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:00:09.127 134146 DEBUG oslo_concurrency.lockutils [req-ab466146-2da6-4f23-8ca2-df7707ce1708 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:00:17.843 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:00:17.844 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:00:17.844 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:00:17.848 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:00:17.849 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:00:17.849 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:00:17.849 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:00:17.850 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:00:17.850 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:00:17.850 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 
01:00:17.850 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:00:17.850 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:00:21.054 134138 DEBUG oslo_service.periodic_task [req-30f4ec0c-52b5-419f-9d87-8bb1563c541c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:00:21.058 134138 DEBUG oslo_concurrency.lockutils [req-b33ec7d4-33ec-4fa9-bfe7-6e857002210f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:00:21.059 134138 DEBUG oslo_concurrency.lockutils [req-b33ec7d4-33ec-4fa9-bfe7-6e857002210f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:00:23.053 134140 DEBUG oslo_service.periodic_task [req-d8030929-4f7d-4a8d-a4ee-3df7c1ece29b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:00:23.058 134140 DEBUG oslo_concurrency.lockutils [req-eebb84fc-c0c8-43d2-9845-7032aba7535f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:00:23.059 134140 DEBUG oslo_concurrency.lockutils [req-eebb84fc-c0c8-43d2-9845-7032aba7535f - - - 
- -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:00:26.073 134145 DEBUG oslo_service.periodic_task [req-489207f6-8d94-478f-9609-9a76ee85b0f1 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:00:26.077 134145 DEBUG oslo_concurrency.lockutils [req-7f5ea338-292f-466f-bc39-734d55884b1e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:00:26.078 134145 DEBUG oslo_concurrency.lockutils [req-7f5ea338-292f-466f-bc39-734d55884b1e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:00:40.070 134146 DEBUG oslo_service.periodic_task [req-ab466146-2da6-4f23-8ca2-df7707ce1708 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:00:40.074 134146 DEBUG oslo_concurrency.lockutils [req-df7da663-f0da-474f-8d11-ef986ae1b73f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:00:40.074 134146 DEBUG oslo_concurrency.lockutils [req-df7da663-f0da-474f-8d11-ef986ae1b73f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:00:52.020 134138 DEBUG oslo_service.periodic_task [req-b33ec7d4-33ec-4fa9-bfe7-6e857002210f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:00:52.024 134138 DEBUG oslo_concurrency.lockutils [req-81b14d78-69ab-495f-801d-5c30d65e5ddb - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:00:52.025 134138 DEBUG oslo_concurrency.lockutils [req-81b14d78-69ab-495f-801d-5c30d65e5ddb - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:00:53.066 134140 DEBUG oslo_service.periodic_task [req-eebb84fc-c0c8-43d2-9845-7032aba7535f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:00:53.070 134140 DEBUG oslo_concurrency.lockutils [req-f019da4e-fd95-4a5b-82da-3a689d78d005 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:00:53.071 134140 DEBUG oslo_concurrency.lockutils [req-f019da4e-fd95-4a5b-82da-3a689d78d005 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:00:57.054 134145 DEBUG oslo_service.periodic_task [req-7f5ea338-292f-466f-bc39-734d55884b1e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:00:57.063 134145 DEBUG oslo_concurrency.lockutils [req-cac37273-8ccb-4c3b-87ad-5b2afdbd3702 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:00:57.063 134145 DEBUG oslo_concurrency.lockutils [req-cac37273-8ccb-4c3b-87ad-5b2afdbd3702 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:01:10.084 134146 DEBUG oslo_service.periodic_task [req-df7da663-f0da-474f-8d11-ef986ae1b73f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:01:10.094 134146 DEBUG oslo_concurrency.lockutils [req-176c8b44-8a96-4495-8f9a-4fe9bb34d0b0 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:01:10.095 134146 DEBUG oslo_concurrency.lockutils [req-176c8b44-8a96-4495-8f9a-4fe9bb34d0b0 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:01:16.826 134145 DEBUG 
oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 4ec5fe2829334bb1994e1b7c5670c2d9 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:01:16.826 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 4ec5fe2829334bb1994e1b7c5670c2d9 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:01:16.826 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 4ec5fe2829334bb1994e1b7c5670c2d9 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:01:16.826 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 4ec5fe2829334bb1994e1b7c5670c2d9 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:01:16.826 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:01:16.826 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:01:16.826 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:01:16.826 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 4ec5fe2829334bb1994e1b7c5670c2d9 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:01:16.826 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 4ec5fe2829334bb1994e1b7c5670c2d9 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:01:16.826 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 4ec5fe2829334bb1994e1b7c5670c2d9 poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:01:16.826 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:01:16.826 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:01:16.826 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:01:16.826 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:01:16.826 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 4ec5fe2829334bb1994e1b7c5670c2d9 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:01:16.826 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:01:16.826 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:01:16.826 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:01:16.826 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:01:16.827 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:01:16.827 134145 DEBUG oslo_concurrency.lockutils 
[req-1c17ef1d-bd00-4130-8184-1bb3b715891d - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:01:16.827 134146 DEBUG oslo_concurrency.lockutils [req-1c17ef1d-bd00-4130-8184-1bb3b715891d - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:01:16.827 134138 DEBUG oslo_concurrency.lockutils [req-1c17ef1d-bd00-4130-8184-1bb3b715891d - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:01:16.827 134145 DEBUG nova.scheduler.host_manager [req-1c17ef1d-bd00-4130-8184-1bb3b715891d - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:01:16.827 134146 DEBUG nova.scheduler.host_manager [req-1c17ef1d-bd00-4130-8184-1bb3b715891d - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:01:16.827 134138 DEBUG nova.scheduler.host_manager [req-1c17ef1d-bd00-4130-8184-1bb3b715891d - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:01:16.827 134145 DEBUG oslo_concurrency.lockutils [req-1c17ef1d-bd00-4130-8184-1bb3b715891d - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:01:16.827 134146 DEBUG oslo_concurrency.lockutils [req-1c17ef1d-bd00-4130-8184-1bb3b715891d - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:01:16.827 134140 DEBUG oslo_concurrency.lockutils [req-1c17ef1d-bd00-4130-8184-1bb3b715891d - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:01:16.827 134138 DEBUG oslo_concurrency.lockutils [req-1c17ef1d-bd00-4130-8184-1bb3b715891d - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:01:16.828 134140 DEBUG nova.scheduler.host_manager [req-1c17ef1d-bd00-4130-8184-1bb3b715891d - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:01:16.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:01:16.828 134140 DEBUG oslo_concurrency.lockutils [req-1c17ef1d-bd00-4130-8184-1bb3b715891d - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:01:16.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:01:16.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:01:16.828 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:01:16.828 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:01:16.828 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:01:16.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:01:16.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:01:16.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:01:16.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:01:16.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:01:16.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:01:17.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:01:17.829 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:01:17.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:01:17.829 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:01:17.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:01:17.830 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:01:17.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:01:17.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:01:17.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:01:17.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:01:17.831 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:01:17.831 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:01:19.831 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:01:19.831 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:01:19.832 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:01:19.832 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:01:19.832 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:01:19.832 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:01:19.833 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:01:19.833 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:01:19.833 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:01:19.833 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:01:19.834 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:01:19.834 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:01:23.020 134138 DEBUG oslo_service.periodic_task [req-81b14d78-69ab-495f-801d-5c30d65e5ddb - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:01:23.024 134138 DEBUG oslo_concurrency.lockutils [req-d8c0c9b9-8ee3-48fc-8250-69c9dcd0611e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:01:23.024 134138 DEBUG oslo_concurrency.lockutils [req-d8c0c9b9-8ee3-48fc-8250-69c9dcd0611e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:01:23.078 134140 DEBUG oslo_service.periodic_task [req-f019da4e-fd95-4a5b-82da-3a689d78d005 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:01:23.082 134140 DEBUG oslo_concurrency.lockutils [req-e4d4b6c5-9893-4f77-bf24-562c3db42901 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:01:23.082 134140 DEBUG oslo_concurrency.lockutils [req-e4d4b6c5-9893-4f77-bf24-562c3db42901 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:01:23.833 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:01:23.834 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:01:23.834 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:01:23.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:01:23.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:01:23.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:01:23.835 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:01:23.835 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:01:23.835 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:01:23.838 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:01:23.838 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:01:23.838 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:01:27.069 134145 DEBUG oslo_service.periodic_task [req-cac37273-8ccb-4c3b-87ad-5b2afdbd3702 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:01:27.074 134145 DEBUG oslo_concurrency.lockutils [req-24a66e80-504c-46a8-8ee0-b0cc40132aa0 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:01:27.075 134145 DEBUG oslo_concurrency.lockutils [req-24a66e80-504c-46a8-8ee0-b0cc40132aa0 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:01:31.842 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:01:31.842 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:01:31.842 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:01:31.842 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:01:31.842 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:01:31.843 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:01:31.843 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:01:31.844 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:01:31.844 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:01:31.844 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:01:31.845 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:01:31.845 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:01:41.070 134146 DEBUG oslo_service.periodic_task [req-176c8b44-8a96-4495-8f9a-4fe9bb34d0b0 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:01:41.074 134146 DEBUG oslo_concurrency.lockutils [req-eb792a54-a408-4025-885e-c2c09468562c - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:01:41.074 134146 DEBUG oslo_concurrency.lockutils [req-eb792a54-a408-4025-885e-c2c09468562c - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:01:47.844 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:01:47.844 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:01:47.844 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:01:47.844 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:01:47.845 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:01:47.845 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:01:47.845 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:01:47.846 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:01:47.846 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:01:47.846 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:01:47.847 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:01:47.847 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:01:53.090 134140 DEBUG oslo_service.periodic_task [req-e4d4b6c5-9893-4f77-bf24-562c3db42901 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:01:53.094 134140 DEBUG oslo_concurrency.lockutils [req-2bcf51f8-81f1-4a72-9174-b0fe8192badd - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:01:53.095 134140 DEBUG oslo_concurrency.lockutils [req-2bcf51f8-81f1-4a72-9174-b0fe8192badd - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:01:54.020 134138 DEBUG oslo_service.periodic_task [req-d8c0c9b9-8ee3-48fc-8250-69c9dcd0611e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:01:54.024 134138 DEBUG oslo_concurrency.lockutils [req-5da89efb-41c3-49b5-8ce0-d095f0d5f1a0 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:01:54.024 134138 DEBUG oslo_concurrency.lockutils [req-5da89efb-41c3-49b5-8ce0-d095f0d5f1a0 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:01:57.083 134145 DEBUG oslo_service.periodic_task [req-24a66e80-504c-46a8-8ee0-b0cc40132aa0 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:01:57.087 134145 DEBUG oslo_concurrency.lockutils [req-6c59534b-f90c-48c2-95bb-2546392ca892 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:01:57.087 134145 DEBUG oslo_concurrency.lockutils [req-6c59534b-f90c-48c2-95bb-2546392ca892 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:02:11.089 134146 DEBUG oslo_service.periodic_task [req-eb792a54-a408-4025-885e-c2c09468562c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:02:11.093 134146 DEBUG oslo_concurrency.lockutils [req-87c0e651-f9b0-4b3f-bd78-fd57570496cf - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:02:11.093 134146 DEBUG oslo_concurrency.lockutils [req-87c0e651-f9b0-4b3f-bd78-fd57570496cf - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:02:23.105 134140 DEBUG oslo_service.periodic_task [req-2bcf51f8-81f1-4a72-9174-b0fe8192badd - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:02:23.109 134140 DEBUG oslo_concurrency.lockutils [req-493dfd30-01c8-4d98-9b43-771b0f710d8f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:02:23.109 134140 DEBUG oslo_concurrency.lockutils [req-493dfd30-01c8-4d98-9b43-771b0f710d8f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:02:25.021 134138 DEBUG oslo_service.periodic_task [req-5da89efb-41c3-49b5-8ce0-d095f0d5f1a0 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:02:25.026 134138 DEBUG oslo_concurrency.lockutils [req-cea38ab0-dd03-4e72-a54e-2eda47b678df - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:02:25.026 134138 DEBUG oslo_concurrency.lockutils [req-cea38ab0-dd03-4e72-a54e-2eda47b678df - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:02:27.094 134145 DEBUG oslo_service.periodic_task [req-6c59534b-f90c-48c2-95bb-2546392ca892 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:02:27.099 134145 DEBUG oslo_concurrency.lockutils [req-9b2f568c-d4f8-4050-9d42-df9b7feaf364 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:02:27.099 134145 DEBUG oslo_concurrency.lockutils [req-9b2f568c-d4f8-4050-9d42-df9b7feaf364 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:02:33.031 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:02:33.031 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:02:33.031 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:02:33.071 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:02:33.072 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:02:33.072 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:02:33.072 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:02:33.073 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:02:33.073 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:02:33.098 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:02:33.098 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:02:33.098 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:02:41.103 134146 DEBUG oslo_service.periodic_task [req-87c0e651-f9b0-4b3f-bd78-fd57570496cf - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:02:41.107 134146 DEBUG oslo_concurrency.lockutils [req-23451119-1ed4-468d-b8f8-e3235985464f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:02:41.107 134146 DEBUG oslo_concurrency.lockutils [req-23451119-1ed4-468d-b8f8-e3235985464f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:02:54.052 134140 DEBUG oslo_service.periodic_task [req-493dfd30-01c8-4d98-9b43-771b0f710d8f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:02:54.056 134140 DEBUG oslo_concurrency.lockutils [req-b34e59b4-b317-4254-865e-3472e2cb803e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:02:54.056 134140 DEBUG oslo_concurrency.lockutils [req-b34e59b4-b317-4254-865e-3472e2cb803e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:02:56.020 134138 DEBUG oslo_service.periodic_task [req-cea38ab0-dd03-4e72-a54e-2eda47b678df - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:02:56.025 134138 DEBUG oslo_concurrency.lockutils [req-5540cfb3-b0fd-4784-841a-7c6e6f928e15 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:02:56.025 134138 DEBUG oslo_concurrency.lockutils [req-5540cfb3-b0fd-4784-841a-7c6e6f928e15 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:02:58.052 134145 DEBUG oslo_service.periodic_task [req-9b2f568c-d4f8-4050-9d42-df9b7feaf364 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:02:58.056 134145 DEBUG oslo_concurrency.lockutils [req-a1761608-6eea-4871-8067-dc30fb04a447 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:02:58.056 134145 DEBUG oslo_concurrency.lockutils [req-a1761608-6eea-4871-8067-dc30fb04a447 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:03:11.119 134146 DEBUG oslo_service.periodic_task [req-23451119-1ed4-468d-b8f8-e3235985464f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:03:11.123 134146 DEBUG oslo_concurrency.lockutils [req-02a35130-2b07-431f-b9ed-84ca1c2a31ec - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:03:11.124 134146 DEBUG oslo_concurrency.lockutils [req-02a35130-2b07-431f-b9ed-84ca1c2a31ec - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:03:20.826 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c4b22c885c0e4fa8adbb964ae21811a0 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:03:20.826 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c4b22c885c0e4fa8adbb964ae21811a0 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:03:20.826 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:03:20.826 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:03:20.826 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c4b22c885c0e4fa8adbb964ae21811a0 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:03:20.826 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c4b22c885c0e4fa8adbb964ae21811a0 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:03:20.826 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c4b22c885c0e4fa8adbb964ae21811a0 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:03:20.826 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c4b22c885c0e4fa8adbb964ae21811a0 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:03:20.826 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:03:20.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:03:20.827 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:03:20.827 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:03:20.827 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:03:20.827 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:03:20.827 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c4b22c885c0e4fa8adbb964ae21811a0 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:03:20.827 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c4b22c885c0e4fa8adbb964ae21811a0 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:03:20.827 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:03:20.827 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:03:20.827 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:03:20.827 134145 DEBUG oslo_concurrency.lockutils [req-23bcb33c-876c-4051-b5b1-151c1a82b4cc - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:03:20.827 134138 DEBUG oslo_concurrency.lockutils [req-23bcb33c-876c-4051-b5b1-151c1a82b4cc - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:03:20.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:03:20.828 134138 DEBUG nova.scheduler.host_manager [req-23bcb33c-876c-4051-b5b1-151c1a82b4cc - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:03:20.828 134145 DEBUG nova.scheduler.host_manager [req-23bcb33c-876c-4051-b5b1-151c1a82b4cc - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:03:20.828 134140 DEBUG oslo_concurrency.lockutils [req-23bcb33c-876c-4051-b5b1-151c1a82b4cc - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:03:20.829 134138 DEBUG oslo_concurrency.lockutils [req-23bcb33c-876c-4051-b5b1-151c1a82b4cc - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:03:20.829 134145 DEBUG oslo_concurrency.lockutils [req-23bcb33c-876c-4051-b5b1-151c1a82b4cc - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:03:20.829 134140 DEBUG nova.scheduler.host_manager [req-23bcb33c-876c-4051-b5b1-151c1a82b4cc - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:03:20.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:03:20.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:03:20.829 134140 DEBUG oslo_concurrency.lockutils [req-23bcb33c-876c-4051-b5b1-151c1a82b4cc - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:03:20.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:03:20.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:03:20.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:03:20.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:03:20.829 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:03:20.829 134146 DEBUG oslo_concurrency.lockutils [req-23bcb33c-876c-4051-b5b1-151c1a82b4cc - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:03:20.829 134140 DEBUG
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:03:20.829 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:03:20.829 134146 DEBUG nova.scheduler.host_manager [req-23bcb33c-876c-4051-b5b1-151c1a82b4cc - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:03:20.830 134146 DEBUG oslo_concurrency.lockutils [req-23bcb33c-876c-4051-b5b1-151c1a82b4cc - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:03:20.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:03:20.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:03:20.831 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:03:21.830 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:03:21.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:03:21.830 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:03:21.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:03:21.830 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:03:21.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:03:21.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:03:21.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:03:21.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:03:21.832 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:03:21.832 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:03:21.832 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:03:23.831 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:03:23.832 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:03:23.832 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:03:23.832 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:03:23.832 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:03:23.833 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:03:23.833 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:03:23.834 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:03:23.834 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:03:23.835 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:03:23.835 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:03:23.835 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:03:25.052 134140 DEBUG oslo_service.periodic_task 
[req-b34e59b4-b317-4254-865e-3472e2cb803e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:03:25.057 134140 DEBUG oslo_concurrency.lockutils [req-60aa4c16-f5bc-4b32-ab8b-ecd457b077c2 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:03:25.058 134140 DEBUG oslo_concurrency.lockutils [req-60aa4c16-f5bc-4b32-ab8b-ecd457b077c2 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:03:27.020 134138 DEBUG oslo_service.periodic_task [req-5540cfb3-b0fd-4784-841a-7c6e6f928e15 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:03:27.024 134138 DEBUG oslo_concurrency.lockutils [req-956b3908-0211-4975-9dd2-321f8e6ca0d2 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:03:27.025 134138 DEBUG oslo_concurrency.lockutils [req-956b3908-0211-4975-9dd2-321f8e6ca0d2 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:03:27.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 
2026-04-23 01:03:27.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:03:27.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:03:27.836 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:03:27.836 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:03:27.837 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:03:27.837 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:03:27.837 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:03:27.837 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:03:27.839 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:03:27.839 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:03:27.840 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 
2026-04-23 01:03:28.062 134145 DEBUG oslo_service.periodic_task [req-a1761608-6eea-4871-8067-dc30fb04a447 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:03:28.066 134145 DEBUG oslo_concurrency.lockutils [req-24b054e7-3329-4f5f-928f-9549bddd2be1 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:03:28.067 134145 DEBUG oslo_concurrency.lockutils [req-24b054e7-3329-4f5f-928f-9549bddd2be1 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:03:35.836 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:03:35.837 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:03:35.837 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:03:35.839 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:03:35.840 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:03:35.840 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:03:35.840 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:03:35.840 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:03:35.840 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:03:35.841 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:03:35.841 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:03:35.841 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:03:42.070 134146 DEBUG oslo_service.periodic_task [req-02a35130-2b07-431f-b9ed-84ca1c2a31ec - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:03:42.074 134146 DEBUG oslo_concurrency.lockutils [req-7f8afa2f-cc52-4fc4-800d-373d01f028ea - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:03:42.074 134146 DEBUG oslo_concurrency.lockutils [req-7f8afa2f-cc52-4fc4-800d-373d01f028ea - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:03:51.839 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:03:51.839 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:03:51.839 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:03:51.842 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:03:51.842 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:03:51.842 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:03:51.842 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:03:51.842 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:03:51.842 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:03:51.843 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 
01:03:51.843 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:03:51.843 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:03:55.063 134140 DEBUG oslo_service.periodic_task [req-60aa4c16-f5bc-4b32-ab8b-ecd457b077c2 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:03:55.068 134140 DEBUG oslo_concurrency.lockutils [req-34f689af-8c62-4d1d-b350-242d00acd644 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:03:55.068 134140 DEBUG oslo_concurrency.lockutils [req-34f689af-8c62-4d1d-b350-242d00acd644 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:03:58.020 134138 DEBUG oslo_service.periodic_task [req-956b3908-0211-4975-9dd2-321f8e6ca0d2 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:03:58.024 134138 DEBUG oslo_concurrency.lockutils [req-e331cbd7-5087-4c69-943d-e4504f5d2390 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:03:58.025 134138 DEBUG oslo_concurrency.lockutils [req-e331cbd7-5087-4c69-943d-e4504f5d2390 - - - 
- -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:03:59.052 134145 DEBUG oslo_service.periodic_task [req-24b054e7-3329-4f5f-928f-9549bddd2be1 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:03:59.056 134145 DEBUG oslo_concurrency.lockutils [req-7e61d645-b2a1-4c8e-8b03-e36e5246f95d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:03:59.056 134145 DEBUG oslo_concurrency.lockutils [req-7e61d645-b2a1-4c8e-8b03-e36e5246f95d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:04:12.085 134146 DEBUG oslo_service.periodic_task [req-7f8afa2f-cc52-4fc4-800d-373d01f028ea - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:04:12.089 134146 DEBUG oslo_concurrency.lockutils [req-0ef4a4c3-4936-4169-8156-97472341b4f9 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:04:12.090 134146 DEBUG oslo_concurrency.lockutils [req-0ef4a4c3-4936-4169-8156-97472341b4f9 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:04:25.074 134140 DEBUG oslo_service.periodic_task [req-34f689af-8c62-4d1d-b350-242d00acd644 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:04:25.078 134140 DEBUG oslo_concurrency.lockutils [req-3d728c29-6929-483f-84f8-41513c8374cf - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:04:25.079 134140 DEBUG oslo_concurrency.lockutils [req-3d728c29-6929-483f-84f8-41513c8374cf - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:04:29.020 134138 DEBUG oslo_service.periodic_task [req-e331cbd7-5087-4c69-943d-e4504f5d2390 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:04:29.024 134138 DEBUG oslo_concurrency.lockutils [req-3faaf72b-341d-4397-9b21-5770923e53ef - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:04:29.025 134138 DEBUG oslo_concurrency.lockutils [req-3faaf72b-341d-4397-9b21-5770923e53ef - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:04:30.051 134145 DEBUG oslo_service.periodic_task [req-7e61d645-b2a1-4c8e-8b03-e36e5246f95d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:04:30.056 134145 DEBUG oslo_concurrency.lockutils [req-c7d11572-05cb-4cb9-8c6a-fce9fb34b675 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:04:30.056 134145 DEBUG oslo_concurrency.lockutils [req-c7d11572-05cb-4cb9-8c6a-fce9fb34b675 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:04:33.033 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:04:33.034 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:04:33.034 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:04:33.073 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:04:33.073 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:04:33.073 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener 
connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:04:33.076 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:04:33.076 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:04:33.076 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:04:33.103 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:04:33.103 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:04:33.103 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:04:42.100 134146 DEBUG oslo_service.periodic_task [req-0ef4a4c3-4936-4169-8156-97472341b4f9 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:04:42.104 134146 DEBUG oslo_concurrency.lockutils [req-4270eeca-a9bc-4c43-aac7-54f251364eec - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:04:42.105 134146 DEBUG oslo_concurrency.lockutils [req-4270eeca-a9bc-4c43-aac7-54f251364eec - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:04:55.084 134140 DEBUG oslo_service.periodic_task [req-3d728c29-6929-483f-84f8-41513c8374cf - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:04:55.088 134140 DEBUG oslo_concurrency.lockutils [req-73d8aa63-4e02-46ec-8fcc-e38a0bc1418c - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:04:55.088 134140 DEBUG oslo_concurrency.lockutils [req-73d8aa63-4e02-46ec-8fcc-e38a0bc1418c - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:04:59.034 134138 DEBUG oslo_service.periodic_task [req-3faaf72b-341d-4397-9b21-5770923e53ef - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:04:59.038 134138 DEBUG oslo_concurrency.lockutils [req-53919d78-eed6-4c1b-9426-f701da303ecd - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:04:59.038 134138 DEBUG oslo_concurrency.lockutils [req-53919d78-eed6-4c1b-9426-f701da303ecd - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:05:00.064 134145 DEBUG oslo_service.periodic_task [req-c7d11572-05cb-4cb9-8c6a-fce9fb34b675 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:05:00.068 134145 DEBUG oslo_concurrency.lockutils [req-b116fbc7-fa29-4cc7-b0b4-12d784453db4 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:05:00.069 134145 DEBUG oslo_concurrency.lockutils [req-b116fbc7-fa29-4cc7-b0b4-12d784453db4 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:05:12.118 134146 DEBUG oslo_service.periodic_task [req-4270eeca-a9bc-4c43-aac7-54f251364eec - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:05:12.123 134146 DEBUG oslo_concurrency.lockutils [req-cf9118b0-52b9-4732-83c1-1fab6e9f30b5 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:05:12.123 134146 DEBUG oslo_concurrency.lockutils [req-cf9118b0-52b9-4732-83c1-1fab6e9f30b5 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:05:22.824 134138 DEBUG 
oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 10d213265be34402b7d7326376d62db8 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:05:22.824 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 10d213265be34402b7d7326376d62db8 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:05:22.824 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 10d213265be34402b7d7326376d62db8 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:05:22.825 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:05:22.825 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:05:22.825 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:05:22.825 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 10d213265be34402b7d7326376d62db8 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:05:22.825 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 10d213265be34402b7d7326376d62db8 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:05:22.825 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 10d213265be34402b7d7326376d62db8 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:05:22.825 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:05:22.825 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:05:22.825 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:05:22.825 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:05:22.825 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:05:22.825 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:05:22.826 134145 DEBUG oslo_concurrency.lockutils [req-d44bc03c-3d7f-4659-92ba-655d38b5514c - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:05:22.826 134138 DEBUG oslo_concurrency.lockutils [req-d44bc03c-3d7f-4659-92ba-655d38b5514c - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:05:22.826 134140 DEBUG oslo_concurrency.lockutils [req-d44bc03c-3d7f-4659-92ba-655d38b5514c - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:05:22.826 134145 DEBUG nova.scheduler.host_manager [req-d44bc03c-3d7f-4659-92ba-655d38b5514c - - - - -] 
Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:05:22.826 134138 DEBUG nova.scheduler.host_manager [req-d44bc03c-3d7f-4659-92ba-655d38b5514c - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:05:22.826 134140 DEBUG nova.scheduler.host_manager [req-d44bc03c-3d7f-4659-92ba-655d38b5514c - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:05:22.826 134145 DEBUG oslo_concurrency.lockutils [req-d44bc03c-3d7f-4659-92ba-655d38b5514c - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:05:22.826 134138 DEBUG oslo_concurrency.lockutils [req-d44bc03c-3d7f-4659-92ba-655d38b5514c - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:05:22.827 134140 DEBUG oslo_concurrency.lockutils [req-d44bc03c-3d7f-4659-92ba-655d38b5514c - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:05:22.827 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:05:22.827 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 
2026-04-23 01:05:22.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:05:22.827 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:05:22.827 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:05:22.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:05:22.827 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:05:22.827 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:05:22.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:05:22.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 10d213265be34402b7d7326376d62db8 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:05:22.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:05:22.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 10d213265be34402b7d7326376d62db8 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:05:22.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is 
running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:05:22.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:05:22.829 134146 DEBUG oslo_concurrency.lockutils [req-d44bc03c-3d7f-4659-92ba-655d38b5514c - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:05:22.829 134146 DEBUG nova.scheduler.host_manager [req-d44bc03c-3d7f-4659-92ba-655d38b5514c - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:05:22.830 134146 DEBUG oslo_concurrency.lockutils [req-d44bc03c-3d7f-4659-92ba-655d38b5514c - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:05:22.831 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:05:22.831 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:05:22.831 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:05:23.828 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:05:23.828 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:05:23.828 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:05:23.828 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:05:23.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:05:23.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:05:23.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:05:23.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:05:23.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:05:23.832 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:05:23.832 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:05:23.832 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:05:25.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection 
timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:05:25.830 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:05:25.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:05:25.830 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:05:25.830 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:05:25.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:05:25.831 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:05:25.831 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:05:25.832 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:05:25.835 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:05:25.835 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:05:25.835 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection 
consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:05:26.051 134140 DEBUG oslo_service.periodic_task [req-73d8aa63-4e02-46ec-8fcc-e38a0bc1418c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:05:26.055 134140 DEBUG oslo_concurrency.lockutils [req-58e21a9e-4fbd-4234-9f78-bef85af6c1fa - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:05:26.056 134140 DEBUG oslo_concurrency.lockutils [req-58e21a9e-4fbd-4234-9f78-bef85af6c1fa - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:05:29.046 134138 DEBUG oslo_service.periodic_task [req-53919d78-eed6-4c1b-9426-f701da303ecd - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:05:29.050 134138 DEBUG oslo_concurrency.lockutils [req-d0c054c2-5e5d-4124-968b-261fcfd57ca3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:05:29.051 134138 DEBUG oslo_concurrency.lockutils [req-d0c054c2-5e5d-4124-968b-261fcfd57ca3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:05:29.832 134140 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:05:29.833 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:05:29.833 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:05:29.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:05:29.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:05:29.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:05:29.834 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:05:29.834 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:05:29.834 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:05:29.839 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:05:29.839 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:05:29.839 134146 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:05:30.073 134145 DEBUG oslo_service.periodic_task [req-b116fbc7-fa29-4cc7-b0b4-12d784453db4 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:05:30.078 134145 DEBUG oslo_concurrency.lockutils [req-75fca0b1-ba33-4be0-88ef-ae1be61d1e63 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:05:30.078 134145 DEBUG oslo_concurrency.lockutils [req-75fca0b1-ba33-4be0-88ef-ae1be61d1e63 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:05:37.836 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:05:37.837 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:05:37.837 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:05:37.838 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:05:37.838 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 
2026-04-23 01:05:37.838 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:05:37.839 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:05:37.840 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:05:37.840 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:05:37.841 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:05:37.841 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:05:37.841 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:05:42.130 134146 DEBUG oslo_service.periodic_task [req-cf9118b0-52b9-4732-83c1-1fab6e9f30b5 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:05:42.135 134146 DEBUG oslo_concurrency.lockutils [req-be6fbbcf-e2f9-438c-a16a-caf5e71b4077 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:05:42.135 134146 DEBUG oslo_concurrency.lockutils [req-be6fbbcf-e2f9-438c-a16a-caf5e71b4077 - - - - -] 
Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:05:53.839 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:05:53.839 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:05:53.839 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:05:53.840 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:05:53.840 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:05:53.841 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:05:53.841 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:05:53.841 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:05:53.841 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:05:53.843 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:05:53.843 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:05:53.843 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:05:56.061 134140 DEBUG oslo_service.periodic_task [req-58e21a9e-4fbd-4234-9f78-bef85af6c1fa - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:05:56.065 134140 DEBUG oslo_concurrency.lockutils [req-5f8c3c7d-8652-409a-8c25-1729a5fcab20 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:05:56.066 134140 DEBUG oslo_concurrency.lockutils [req-5f8c3c7d-8652-409a-8c25-1729a5fcab20 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:06:00.020 134138 DEBUG oslo_service.periodic_task [req-d0c054c2-5e5d-4124-968b-261fcfd57ca3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:06:00.025 134138 DEBUG oslo_concurrency.lockutils [req-3bc8b911-a253-4747-9436-800109557800 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:06:00.025 
134138 DEBUG oslo_concurrency.lockutils [req-3bc8b911-a253-4747-9436-800109557800 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:06:01.053 134145 DEBUG oslo_service.periodic_task [req-75fca0b1-ba33-4be0-88ef-ae1be61d1e63 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:06:01.059 134145 DEBUG oslo_concurrency.lockutils [req-0d632cab-cd8b-4286-88ec-9a2934981806 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:06:01.059 134145 DEBUG oslo_concurrency.lockutils [req-0d632cab-cd8b-4286-88ec-9a2934981806 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:06:13.069 134146 DEBUG oslo_service.periodic_task [req-be6fbbcf-e2f9-438c-a16a-caf5e71b4077 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:06:13.074 134146 DEBUG oslo_concurrency.lockutils [req-e72191ca-1044-4333-8c10-f41b01e31640 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:06:13.074 134146 DEBUG oslo_concurrency.lockutils [req-e72191ca-1044-4333-8c10-f41b01e31640 - - - - -] Lock 
"de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:06:26.071 134140 DEBUG oslo_service.periodic_task [req-5f8c3c7d-8652-409a-8c25-1729a5fcab20 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:06:26.075 134140 DEBUG oslo_concurrency.lockutils [req-3518275f-0337-40fa-8086-363f28945f5a - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:06:26.075 134140 DEBUG oslo_concurrency.lockutils [req-3518275f-0337-40fa-8086-363f28945f5a - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:06:30.035 134138 DEBUG oslo_service.periodic_task [req-3bc8b911-a253-4747-9436-800109557800 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:06:30.040 134138 DEBUG oslo_concurrency.lockutils [req-28e59404-dca1-40f4-a9c8-48c8aa4d9960 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:06:30.040 134138 DEBUG oslo_concurrency.lockutils [req-28e59404-dca1-40f4-a9c8-48c8aa4d9960 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:06:32.052 134145 DEBUG oslo_service.periodic_task [req-0d632cab-cd8b-4286-88ec-9a2934981806 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:06:32.056 134145 DEBUG oslo_concurrency.lockutils [req-15fa4167-7086-4d7c-b250-eecebe76b5c4 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:06:32.056 134145 DEBUG oslo_concurrency.lockutils [req-15fa4167-7086-4d7c-b250-eecebe76b5c4 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:06:33.036 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:06:33.037 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:06:33.038 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:06:33.076 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:06:33.076 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:06:33.077 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:06:33.079 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:06:33.079 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:06:33.079 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:06:33.107 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:06:33.107 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:06:33.108 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:06:43.086 134146 DEBUG oslo_service.periodic_task [req-e72191ca-1044-4333-8c10-f41b01e31640 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:06:43.090 134146 DEBUG oslo_concurrency.lockutils [req-31dd5945-a24a-4422-810c-2e88d35fae64 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:06:43.091 134146 DEBUG 
oslo_concurrency.lockutils [req-31dd5945-a24a-4422-810c-2e88d35fae64 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:06:56.081 134140 DEBUG oslo_service.periodic_task [req-3518275f-0337-40fa-8086-363f28945f5a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:06:56.085 134140 DEBUG oslo_concurrency.lockutils [req-68b66bfd-1d84-47af-94f5-a461dbbdb634 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:06:56.086 134140 DEBUG oslo_concurrency.lockutils [req-68b66bfd-1d84-47af-94f5-a461dbbdb634 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:07:01.020 134138 DEBUG oslo_service.periodic_task [req-28e59404-dca1-40f4-a9c8-48c8aa4d9960 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:07:01.025 134138 DEBUG oslo_concurrency.lockutils [req-9ab282b9-f74f-414c-9dce-9fccfc90d9a3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:07:01.025 134138 DEBUG oslo_concurrency.lockutils [req-9ab282b9-f74f-414c-9dce-9fccfc90d9a3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:07:03.052 134145 DEBUG oslo_service.periodic_task [req-15fa4167-7086-4d7c-b250-eecebe76b5c4 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:07:03.056 134145 DEBUG oslo_concurrency.lockutils [req-f22c95d3-e7c1-4adb-a3d6-02619467daa8 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:07:03.056 134145 DEBUG oslo_concurrency.lockutils [req-f22c95d3-e7c1-4adb-a3d6-02619467daa8 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:07:13.104 134146 DEBUG oslo_service.periodic_task [req-31dd5945-a24a-4422-810c-2e88d35fae64 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:07:13.109 134146 DEBUG oslo_concurrency.lockutils [req-3ee0e733-b400-4145-8f64-e1e1cc2fff7b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:07:13.109 134146 DEBUG oslo_concurrency.lockutils [req-3ee0e733-b400-4145-8f64-e1e1cc2fff7b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:07:26.093 134140 DEBUG oslo_service.periodic_task [req-68b66bfd-1d84-47af-94f5-a461dbbdb634 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:07:26.097 134140 DEBUG oslo_concurrency.lockutils [req-5deca433-2721-4a3b-8e8b-2ec6f41a8f27 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:07:26.097 134140 DEBUG oslo_concurrency.lockutils [req-5deca433-2721-4a3b-8e8b-2ec6f41a8f27 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:07:26.827 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 795af1fc16f54989a6d7236a55298e98 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:07:26.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 795af1fc16f54989a6d7236a55298e98 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:07:26.827 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 795af1fc16f54989a6d7236a55298e98 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:07:26.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:07:26.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:07:26.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 795af1fc16f54989a6d7236a55298e98 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:07:26.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 795af1fc16f54989a6d7236a55298e98 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:07:26.828 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:07:26.828 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 795af1fc16f54989a6d7236a55298e98 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:07:26.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:07:26.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:07:26.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:07:26.828 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:07:26.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:07:26.828 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:07:26.829 134138 DEBUG oslo_concurrency.lockutils [req-98eca203-6f8e-47dd-96ef-8c86a604cee1 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:07:26.829 134146 DEBUG oslo_concurrency.lockutils [req-98eca203-6f8e-47dd-96ef-8c86a604cee1 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:07:26.829 134138 DEBUG nova.scheduler.host_manager [req-98eca203-6f8e-47dd-96ef-8c86a604cee1 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:07:26.829 134145 DEBUG oslo_concurrency.lockutils [req-98eca203-6f8e-47dd-96ef-8c86a604cee1 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:07:26.829 134138 DEBUG oslo_concurrency.lockutils [req-98eca203-6f8e-47dd-96ef-8c86a604cee1 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:07:26.829 134145 DEBUG nova.scheduler.host_manager [req-98eca203-6f8e-47dd-96ef-8c86a604cee1 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:07:26.829 134146 DEBUG nova.scheduler.host_manager [req-98eca203-6f8e-47dd-96ef-8c86a604cee1 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:07:26.830 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:07:26.830 134145 DEBUG oslo_concurrency.lockutils [req-98eca203-6f8e-47dd-96ef-8c86a604cee1 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:07:26.830 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:07:26.830 134146 DEBUG oslo_concurrency.lockutils [req-98eca203-6f8e-47dd-96ef-8c86a604cee1 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:07:26.830 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:07:26.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:07:26.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:07:26.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:07:26.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:07:26.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:07:26.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:07:26.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 795af1fc16f54989a6d7236a55298e98 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:07:26.832 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:07:26.832 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 795af1fc16f54989a6d7236a55298e98 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:07:26.832 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:07:26.832 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:07:26.833 134140 DEBUG oslo_concurrency.lockutils [req-98eca203-6f8e-47dd-96ef-8c86a604cee1 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:07:26.833 134140 DEBUG nova.scheduler.host_manager 
[req-98eca203-6f8e-47dd-96ef-8c86a604cee1 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:07:26.833 134140 DEBUG oslo_concurrency.lockutils [req-98eca203-6f8e-47dd-96ef-8c86a604cee1 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:07:26.834 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:07:26.834 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:07:26.835 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:07:27.831 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:07:27.831 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:07:27.831 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:07:27.831 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:07:27.832 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:07:27.832 
134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:07:27.832 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:07:27.832 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:07:27.832 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:07:27.835 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:07:27.835 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:07:27.836 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:07:29.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:07:29.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:07:29.834 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:07:29.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 
01:07:29.834 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:07:29.834 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:07:29.835 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:07:29.835 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:07:29.835 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:07:29.838 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:07:29.838 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:07:29.838 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:07:32.020 134138 DEBUG oslo_service.periodic_task [req-9ab282b9-f74f-414c-9dce-9fccfc90d9a3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:07:32.024 134138 DEBUG oslo_concurrency.lockutils [req-43c4ed8a-7c44-483b-95c0-fb2a6c3dff21 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:07:32.025 134138 DEBUG oslo_concurrency.lockutils [req-43c4ed8a-7c44-483b-95c0-fb2a6c3dff21 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:07:33.063 134145 DEBUG oslo_service.periodic_task [req-f22c95d3-e7c1-4adb-a3d6-02619467daa8 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:07:33.068 134145 DEBUG oslo_concurrency.lockutils [req-4090811b-7870-4452-8d83-68bcb52e7f80 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:07:33.069 134145 DEBUG oslo_concurrency.lockutils [req-4090811b-7870-4452-8d83-68bcb52e7f80 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:07:33.836 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:07:33.836 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:07:33.836 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:07:33.836 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:07:33.837 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:07:33.837 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:07:33.837 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:07:33.837 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:07:33.837 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:07:33.840 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:07:33.840 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:07:33.840 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:07:41.844 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:07:41.844 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:07:41.845 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:07:41.845 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:07:41.845 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:07:41.845 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:07:41.846 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:07:41.846 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:07:41.846 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:07:41.849 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:07:41.849 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:07:41.850 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:07:44.068 134146 DEBUG oslo_service.periodic_task [req-3ee0e733-b400-4145-8f64-e1e1cc2fff7b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:07:44.073 
134146 DEBUG oslo_concurrency.lockutils [req-76a0115b-67d0-481f-8267-ced0bd63261d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:07:44.073 134146 DEBUG oslo_concurrency.lockutils [req-76a0115b-67d0-481f-8267-ced0bd63261d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:07:56.102 134140 DEBUG oslo_service.periodic_task [req-5deca433-2721-4a3b-8e8b-2ec6f41a8f27 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:07:56.107 134140 DEBUG oslo_concurrency.lockutils [req-becf9f27-4844-44ac-8b2b-1a27e4dc64ce - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:07:56.107 134140 DEBUG oslo_concurrency.lockutils [req-becf9f27-4844-44ac-8b2b-1a27e4dc64ce - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:07:57.845 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:07:57.846 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:07:57.846 134138 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:07:57.847 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:07:57.847 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:07:57.847 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:07:57.847 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:07:57.848 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:07:57.848 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:07:57.852 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:07:57.852 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:07:57.852 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:08:02.031 134138 DEBUG oslo_service.periodic_task [req-43c4ed8a-7c44-483b-95c0-fb2a6c3dff21 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks 
/usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:08:02.035 134138 DEBUG oslo_concurrency.lockutils [req-84b7499d-4754-4a40-8aa0-31040dd2be03 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:08:02.035 134138 DEBUG oslo_concurrency.lockutils [req-84b7499d-4754-4a40-8aa0-31040dd2be03 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:08:04.053 134145 DEBUG oslo_service.periodic_task [req-4090811b-7870-4452-8d83-68bcb52e7f80 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:08:04.057 134145 DEBUG oslo_concurrency.lockutils [req-27767b7d-9d7f-4759-b552-69c97dc7fcee - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:08:04.057 134145 DEBUG oslo_concurrency.lockutils [req-27767b7d-9d7f-4759-b552-69c97dc7fcee - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:08:14.079 134146 DEBUG oslo_service.periodic_task [req-76a0115b-67d0-481f-8267-ced0bd63261d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:08:14.083 134146 DEBUG oslo_concurrency.lockutils [req-8ef90347-3670-43d4-b345-522895f031b0 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:08:14.083 134146 DEBUG oslo_concurrency.lockutils [req-8ef90347-3670-43d4-b345-522895f031b0 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:08:26.115 134140 DEBUG oslo_service.periodic_task [req-becf9f27-4844-44ac-8b2b-1a27e4dc64ce - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:08:26.119 134140 DEBUG oslo_concurrency.lockutils [req-b79fa596-9e6f-4597-a896-3fbaae05b8af - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:08:26.119 134140 DEBUG oslo_concurrency.lockutils [req-b79fa596-9e6f-4597-a896-3fbaae05b8af - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:08:32.045 134138 DEBUG oslo_service.periodic_task [req-84b7499d-4754-4a40-8aa0-31040dd2be03 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:08:32.049 134138 DEBUG oslo_concurrency.lockutils [req-0afeeb39-86a7-4601-a2a0-6c882b365a41 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:08:32.049 134138 DEBUG oslo_concurrency.lockutils [req-0afeeb39-86a7-4601-a2a0-6c882b365a41 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:08:33.038 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:08:33.038 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:08:33.038 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:08:33.081 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:08:33.081 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:08:33.082 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:08:33.083 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:08:33.083 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:08:33.084 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:08:33.107 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:08:33.107 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:08:33.107 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:08:34.063 134145 DEBUG oslo_service.periodic_task [req-27767b7d-9d7f-4759-b552-69c97dc7fcee - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:08:34.067 134145 DEBUG oslo_concurrency.lockutils [req-bc2ce003-e0a2-4dde-96c9-c9783c2ea33a - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:08:34.068 134145 DEBUG oslo_concurrency.lockutils [req-bc2ce003-e0a2-4dde-96c9-c9783c2ea33a - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:08:45.070 134146 DEBUG oslo_service.periodic_task [req-8ef90347-3670-43d4-b345-522895f031b0 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:08:45.074 134146 DEBUG oslo_concurrency.lockutils [req-5051dcde-df7e-4398-b221-7f6ded6a194a - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:08:45.074 134146 DEBUG oslo_concurrency.lockutils [req-5051dcde-df7e-4398-b221-7f6ded6a194a - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:08:56.125 134140 DEBUG oslo_service.periodic_task [req-b79fa596-9e6f-4597-a896-3fbaae05b8af - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:08:56.130 134140 DEBUG oslo_concurrency.lockutils [req-9c49030b-99bd-4559-991f-4f5049eb4a1d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:08:56.130 134140 DEBUG oslo_concurrency.lockutils [req-9c49030b-99bd-4559-991f-4f5049eb4a1d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:09:02.062 134138 DEBUG oslo_service.periodic_task [req-0afeeb39-86a7-4601-a2a0-6c882b365a41 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:09:02.067 134138 DEBUG oslo_concurrency.lockutils [req-e40963e0-e350-44c5-ba36-6c8f2f35fa38 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:09:02.067 134138 DEBUG oslo_concurrency.lockutils [req-e40963e0-e350-44c5-ba36-6c8f2f35fa38 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:09:04.072 134145 DEBUG oslo_service.periodic_task [req-bc2ce003-e0a2-4dde-96c9-c9783c2ea33a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:09:04.077 134145 DEBUG oslo_concurrency.lockutils [req-d32a3513-55ff-4aa3-b5e0-119ed8f48586 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:09:04.077 134145 DEBUG oslo_concurrency.lockutils [req-d32a3513-55ff-4aa3-b5e0-119ed8f48586 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:09:16.069 134146 DEBUG oslo_service.periodic_task [req-5051dcde-df7e-4398-b221-7f6ded6a194a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:09:16.074 134146 DEBUG oslo_concurrency.lockutils [req-d4323e60-3cff-40db-a05e-1401709882f6 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:09:16.074 134146 DEBUG oslo_concurrency.lockutils [req-d4323e60-3cff-40db-a05e-1401709882f6 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:09:26.137 134140 DEBUG oslo_service.periodic_task [req-9c49030b-99bd-4559-991f-4f5049eb4a1d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:09:26.141 134140 DEBUG oslo_concurrency.lockutils [req-31039132-0a60-4d39-a52a-67463bf66c56 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:09:26.141 134140 DEBUG oslo_concurrency.lockutils [req-31039132-0a60-4d39-a52a-67463bf66c56 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:09:28.827 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 38130a289e364967b458ff3479aa2609 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:09:28.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 38130a289e364967b458ff3479aa2609 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:09:28.828 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:09:28.828 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 38130a289e364967b458ff3479aa2609 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:09:28.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:09:28.828 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:09:28.828 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:09:28.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 38130a289e364967b458ff3479aa2609 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:09:28.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 38130a289e364967b458ff3479aa2609 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:09:28.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:09:28.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:09:28.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:09:28.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 38130a289e364967b458ff3479aa2609 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:09:28.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:09:28.829 134145 DEBUG oslo_concurrency.lockutils [req-5b691b3d-2a22-4316-83a0-4b5bb496be77 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:09:28.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:09:28.829 134145 DEBUG nova.scheduler.host_manager [req-5b691b3d-2a22-4316-83a0-4b5bb496be77 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:09:28.829 134145 DEBUG oslo_concurrency.lockutils [req-5b691b3d-2a22-4316-83a0-4b5bb496be77 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:09:28.829 134138 DEBUG oslo_concurrency.lockutils [req-5b691b3d-2a22-4316-83a0-4b5bb496be77 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:09:28.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:09:28.829 134138 DEBUG nova.scheduler.host_manager [req-5b691b3d-2a22-4316-83a0-4b5bb496be77 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:09:28.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:09:28.829 134146 DEBUG oslo_concurrency.lockutils [req-5b691b3d-2a22-4316-83a0-4b5bb496be77 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:09:28.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:09:28.830 134138 DEBUG oslo_concurrency.lockutils [req-5b691b3d-2a22-4316-83a0-4b5bb496be77 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:09:28.830 134146 DEBUG nova.scheduler.host_manager [req-5b691b3d-2a22-4316-83a0-4b5bb496be77 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:09:28.830 134146 DEBUG oslo_concurrency.lockutils [req-5b691b3d-2a22-4316-83a0-4b5bb496be77 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:09:28.830 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:09:28.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:09:28.830 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:09:28.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:09:28.830 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:09:28.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:09:28.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 38130a289e364967b458ff3479aa2609 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:09:28.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:09:28.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 38130a289e364967b458ff3479aa2609 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:09:28.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:09:28.832 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:09:28.832 134140 DEBUG oslo_concurrency.lockutils [req-5b691b3d-2a22-4316-83a0-4b5bb496be77 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:09:28.833 134140 DEBUG nova.scheduler.host_manager [req-5b691b3d-2a22-4316-83a0-4b5bb496be77 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:09:28.833 134140 DEBUG oslo_concurrency.lockutils [req-5b691b3d-2a22-4316-83a0-4b5bb496be77 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:09:28.834 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:09:28.834 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:09:28.834 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:09:29.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:09:29.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:09:29.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:09:29.831 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:09:29.831 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:09:29.831 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:09:29.831 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:09:29.832 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:09:29.832 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:09:29.835 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:09:29.836 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:09:29.836 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:09:31.831 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:09:31.832 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:09:31.832 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:09:31.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:09:31.834 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:09:31.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:09:31.834 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:09:31.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:09:31.834 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:09:31.838 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:09:31.838 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:09:31.839 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:09:32.073 134138 DEBUG oslo_service.periodic_task [req-e40963e0-e350-44c5-ba36-6c8f2f35fa38 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:09:32.079 134138 DEBUG oslo_concurrency.lockutils [req-7646126e-f61f-446f-b508-ccb8fbc79853 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:09:32.079 134138 DEBUG oslo_concurrency.lockutils [req-7646126e-f61f-446f-b508-ccb8fbc79853 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:09:34.082 134145 DEBUG oslo_service.periodic_task [req-d32a3513-55ff-4aa3-b5e0-119ed8f48586 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:09:34.087 134145 DEBUG oslo_concurrency.lockutils [req-a178569d-f6ce-443d-9b5f-96d5d88bb6f3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:09:34.087 134145 DEBUG oslo_concurrency.lockutils [req-a178569d-f6ce-443d-9b5f-96d5d88bb6f3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:09:35.835 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:09:35.835 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:09:35.835 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:09:35.837 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:09:35.837 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:09:35.837 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:09:35.837 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:09:35.838 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:09:35.838 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:09:35.840 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:09:35.841 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:09:35.841 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:09:43.843 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:09:43.844 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:09:43.844 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:09:43.846 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:09:43.845 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:09:43.846 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:09:43.846 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:09:43.846 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:09:43.846 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:09:43.849 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:09:43.849 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:09:43.850 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:09:47.070 134146 DEBUG oslo_service.periodic_task [req-d4323e60-3cff-40db-a05e-1401709882f6 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:09:47.074 134146 DEBUG oslo_concurrency.lockutils [req-b30bd4c2-e293-4754-8bb5-fc0e8fb867cc - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:09:47.074 134146 DEBUG oslo_concurrency.lockutils [req-b30bd4c2-e293-4754-8bb5-fc0e8fb867cc - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:09:56.148 134140 DEBUG oslo_service.periodic_task [req-31039132-0a60-4d39-a52a-67463bf66c56 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:09:56.153 134140 DEBUG oslo_concurrency.lockutils [req-ea8bff8d-eaf5-4c36-9357-58d3396a88a3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:09:56.153 134140 DEBUG oslo_concurrency.lockutils [req-ea8bff8d-eaf5-4c36-9357-58d3396a88a3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:09:59.846 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:09:59.846 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:09:59.846 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:09:59.847 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:09:59.847 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:09:59.847 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:09:59.848 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:09:59.849 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:09:59.849 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:09:59.852 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:09:59.852 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:09:59.852 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:10:02.084 134138 DEBUG oslo_service.periodic_task [req-7646126e-f61f-446f-b508-ccb8fbc79853 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:10:02.088 134138 DEBUG oslo_concurrency.lockutils [req-df5da850-466c-47e2-bef4-8cc47461d08d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:10:02.089 134138 DEBUG oslo_concurrency.lockutils [req-df5da850-466c-47e2-bef4-8cc47461d08d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:10:05.052 134145 DEBUG oslo_service.periodic_task [req-a178569d-f6ce-443d-9b5f-96d5d88bb6f3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:10:05.056 134145 DEBUG oslo_concurrency.lockutils [req-e1e20083-aa25-48cd-a023-27b7dea453db - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:10:05.057 134145 DEBUG oslo_concurrency.lockutils [req-e1e20083-aa25-48cd-a023-27b7dea453db - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:10:18.069 134146 DEBUG oslo_service.periodic_task [req-b30bd4c2-e293-4754-8bb5-fc0e8fb867cc - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:10:18.074 134146 DEBUG oslo_concurrency.lockutils [req-2993ce08-fac9-4050-b543-31aebe721312 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:10:18.074 134146 DEBUG oslo_concurrency.lockutils [req-2993ce08-fac9-4050-b543-31aebe721312 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:10:27.052 134140 DEBUG oslo_service.periodic_task [req-ea8bff8d-eaf5-4c36-9357-58d3396a88a3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:10:27.056 134140 DEBUG oslo_concurrency.lockutils [req-0eee0d3f-4460-45cc-8e8e-1344769a3a11 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:10:27.057 134140 DEBUG oslo_concurrency.lockutils [req-0eee0d3f-4460-45cc-8e8e-1344769a3a11 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:10:32.096 134138 DEBUG oslo_service.periodic_task [req-df5da850-466c-47e2-bef4-8cc47461d08d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:10:32.100 134138 DEBUG oslo_concurrency.lockutils [req-3ccc16c7-2db2-4e76-b0f3-46c2144adcd3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:10:32.101 134138 DEBUG oslo_concurrency.lockutils [req-3ccc16c7-2db2-4e76-b0f3-46c2144adcd3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:10:33.042 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:10:33.042 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:10:33.042 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:10:33.081 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:10:33.081 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:10:33.081 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:10:33.086 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:10:33.086 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:10:33.086 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:10:33.110 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:10:33.110 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:10:33.111 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:10:35.062 134145 DEBUG oslo_service.periodic_task [req-e1e20083-aa25-48cd-a023-27b7dea453db - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:10:35.066 134145 DEBUG oslo_concurrency.lockutils [req-fc778259-a95e-4861-945c-083670b78526 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:10:35.066 134145 DEBUG oslo_concurrency.lockutils [req-fc778259-a95e-4861-945c-083670b78526 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:10:49.069 134146 DEBUG oslo_service.periodic_task [req-2993ce08-fac9-4050-b543-31aebe721312 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:10:49.073 134146 DEBUG oslo_concurrency.lockutils [req-f213219b-629d-4832-9614-3fc048f688fa - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:10:49.074 134146 DEBUG oslo_concurrency.lockutils [req-f213219b-629d-4832-9614-3fc048f688fa - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:10:57.063 134140 DEBUG oslo_service.periodic_task [req-0eee0d3f-4460-45cc-8e8e-1344769a3a11 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:10:57.068 134140 DEBUG oslo_concurrency.lockutils [req-88dab50f-c18b-407d-a6d2-e34601bce115 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:10:57.068 134140 DEBUG oslo_concurrency.lockutils [req-88dab50f-c18b-407d-a6d2-e34601bce115 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:11:03.019 134138 DEBUG oslo_service.periodic_task [req-3ccc16c7-2db2-4e76-b0f3-46c2144adcd3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:11:03.024 134138 DEBUG oslo_concurrency.lockutils [req-8730540f-ab6f-4728-bf90-fb1ee1d6cfe1 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:11:03.024 134138 DEBUG oslo_concurrency.lockutils [req-8730540f-ab6f-4728-bf90-fb1ee1d6cfe1 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:11:06.052 134145 DEBUG oslo_service.periodic_task [req-fc778259-a95e-4861-945c-083670b78526 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:11:06.056 134145 DEBUG oslo_concurrency.lockutils [req-97866806-f017-4a6d-83b9-2f66b1206112 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:11:06.056 134145 DEBUG 
oslo_concurrency.lockutils [req-97866806-f017-4a6d-83b9-2f66b1206112 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:11:20.070 134146 DEBUG oslo_service.periodic_task [req-f213219b-629d-4832-9614-3fc048f688fa - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:11:20.074 134146 DEBUG oslo_concurrency.lockutils [req-42918f01-e586-440d-b997-aea748ba713e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:11:20.074 134146 DEBUG oslo_concurrency.lockutils [req-42918f01-e586-440d-b997-aea748ba713e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:11:28.052 134140 DEBUG oslo_service.periodic_task [req-88dab50f-c18b-407d-a6d2-e34601bce115 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:11:28.056 134140 DEBUG oslo_concurrency.lockutils [req-97af78f0-4582-40f9-9b7c-ccd614e4acbf - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:11:28.056 134140 DEBUG oslo_concurrency.lockutils [req-97af78f0-4582-40f9-9b7c-ccd614e4acbf - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:11:28.826 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1b39378164bb424faeadb3d943a73c72 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:11:28.826 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1b39378164bb424faeadb3d943a73c72 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:11:28.826 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1b39378164bb424faeadb3d943a73c72 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:11:28.826 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1b39378164bb424faeadb3d943a73c72 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:11:28.826 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:11:28.826 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:11:28.826 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1b39378164bb424faeadb3d943a73c72 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:11:28.826 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:11:28.826 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1b39378164bb424faeadb3d943a73c72 poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:11:28.826 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:11:28.826 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1b39378164bb424faeadb3d943a73c72 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:11:28.827 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:11:28.826 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1b39378164bb424faeadb3d943a73c72 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:11:28.827 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:11:28.827 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:11:28.827 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:11:28.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:11:28.827 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:11:28.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:11:28.827 134140 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:11:28.827 134146 DEBUG oslo_concurrency.lockutils [req-4548d427-e8ba-4fe0-b68f-ea612097af71 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:11:28.827 134145 DEBUG oslo_concurrency.lockutils [req-4548d427-e8ba-4fe0-b68f-ea612097af71 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:11:28.828 134146 DEBUG nova.scheduler.host_manager [req-4548d427-e8ba-4fe0-b68f-ea612097af71 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:11:28.828 134145 DEBUG nova.scheduler.host_manager [req-4548d427-e8ba-4fe0-b68f-ea612097af71 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:11:28.828 134138 DEBUG oslo_concurrency.lockutils [req-4548d427-e8ba-4fe0-b68f-ea612097af71 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:11:28.828 134140 DEBUG oslo_concurrency.lockutils [req-4548d427-e8ba-4fe0-b68f-ea612097af71 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:11:28.828 134145 DEBUG oslo_concurrency.lockutils [req-4548d427-e8ba-4fe0-b68f-ea612097af71 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:11:28.828 134146 DEBUG oslo_concurrency.lockutils [req-4548d427-e8ba-4fe0-b68f-ea612097af71 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:11:28.828 134140 DEBUG nova.scheduler.host_manager [req-4548d427-e8ba-4fe0-b68f-ea612097af71 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:11:28.828 134138 DEBUG nova.scheduler.host_manager [req-4548d427-e8ba-4fe0-b68f-ea612097af71 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:11:28.828 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:11:28.828 134140 DEBUG oslo_concurrency.lockutils [req-4548d427-e8ba-4fe0-b68f-ea612097af71 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:11:28.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:11:28.828 134138 DEBUG oslo_concurrency.lockutils [req-4548d427-e8ba-4fe0-b68f-ea612097af71 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:11:28.828 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:11:28.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:11:28.828 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:11:28.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:11:28.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:11:28.829 134140 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:11:28.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:11:28.830 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:11:28.830 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:11:28.830 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:11:29.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:11:29.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:11:29.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:11:29.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:11:29.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:11:29.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:11:29.831 134138 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:11:29.832 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:11:29.832 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:11:29.832 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:11:29.832 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:11:29.832 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:11:31.832 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:11:31.832 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:11:31.832 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:11:31.832 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:11:31.833 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:11:31.833 134145 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:11:31.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:11:31.834 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:11:31.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:11:31.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:11:31.834 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:11:31.834 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:11:33.030 134138 DEBUG oslo_service.periodic_task [req-8730540f-ab6f-4728-bf90-fb1ee1d6cfe1 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:11:33.036 134138 DEBUG oslo_concurrency.lockutils [req-c359cc22-2968-4752-ba65-4abbc0b9f1b9 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:11:33.036 134138 DEBUG oslo_concurrency.lockutils [req-c359cc22-2968-4752-ba65-4abbc0b9f1b9 - - - - -] Lock 
"de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:11:35.834 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:11:35.834 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:11:35.834 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:11:35.835 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:11:35.835 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:11:35.835 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:11:35.836 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:11:35.836 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:11:35.836 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:11:35.836 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:11:35.836 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:11:35.837 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:11:37.052 134145 DEBUG oslo_service.periodic_task [req-97866806-f017-4a6d-83b9-2f66b1206112 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:11:37.057 134145 DEBUG oslo_concurrency.lockutils [req-b062c19c-04aa-4c8e-9240-b57587b09c9e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:11:37.057 134145 DEBUG oslo_concurrency.lockutils [req-b062c19c-04aa-4c8e-9240-b57587b09c9e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:11:43.842 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:11:43.842 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:11:43.842 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:11:43.843 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:11:43.843 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:11:43.843 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:11:43.844 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:11:43.844 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:11:43.844 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:11:43.845 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:11:43.845 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:11:43.846 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:11:51.069 134146 DEBUG oslo_service.periodic_task [req-42918f01-e586-440d-b997-aea748ba713e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:11:51.073 134146 DEBUG oslo_concurrency.lockutils [req-54a71546-d856-4d27-a0b7-970bbd708e62 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:11:51.073 134146 DEBUG oslo_concurrency.lockutils [req-54a71546-d856-4d27-a0b7-970bbd708e62 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:11:58.063 134140 DEBUG oslo_service.periodic_task [req-97af78f0-4582-40f9-9b7c-ccd614e4acbf - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:11:58.068 134140 DEBUG oslo_concurrency.lockutils [req-c13faf21-3e99-4609-9456-9bb0df65fae7 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:11:58.068 134140 DEBUG oslo_concurrency.lockutils [req-c13faf21-3e99-4609-9456-9bb0df65fae7 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:11:59.844 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:11:59.845 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:11:59.844 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:11:59.845 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:11:59.845 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:11:59.845 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:11:59.846 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:11:59.846 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:11:59.846 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:11:59.848 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:11:59.848 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:11:59.848 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:12:03.043 134138 DEBUG oslo_service.periodic_task [req-c359cc22-2968-4752-ba65-4abbc0b9f1b9 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:12:03.047 134138 DEBUG oslo_concurrency.lockutils [req-18cbccd5-e15e-45a5-8667-ab984b45d517 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:12:03.048 134138 DEBUG oslo_concurrency.lockutils [req-18cbccd5-e15e-45a5-8667-ab984b45d517 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:12:07.063 134145 DEBUG oslo_service.periodic_task [req-b062c19c-04aa-4c8e-9240-b57587b09c9e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:12:07.067 134145 DEBUG oslo_concurrency.lockutils [req-99d4cbd8-553c-4d40-b3e5-83b2bc5a9833 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:12:07.068 134145 DEBUG oslo_concurrency.lockutils [req-99d4cbd8-553c-4d40-b3e5-83b2bc5a9833 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:12:22.069 134146 DEBUG oslo_service.periodic_task [req-54a71546-d856-4d27-a0b7-970bbd708e62 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:12:22.074 134146 DEBUG oslo_concurrency.lockutils [req-99bfaeb0-3743-464f-8831-b36dce3d5cd6 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:12:22.074 134146 DEBUG oslo_concurrency.lockutils [req-99bfaeb0-3743-464f-8831-b36dce3d5cd6 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:12:29.053 134140 DEBUG oslo_service.periodic_task [req-c13faf21-3e99-4609-9456-9bb0df65fae7 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:12:29.057 134140 DEBUG oslo_concurrency.lockutils [req-a2295b12-4802-42d9-a429-45da7ee2f578 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:12:29.057 134140 DEBUG oslo_concurrency.lockutils [req-a2295b12-4802-42d9-a429-45da7ee2f578 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:12:33.048 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:12:33.049 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:12:33.049 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:12:33.086 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:12:33.086 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:12:33.086 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:12:33.090 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:12:33.090 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:12:33.090 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:12:33.113 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:12:33.113 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:12:33.113 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:12:34.020 134138 DEBUG oslo_service.periodic_task [req-18cbccd5-e15e-45a5-8667-ab984b45d517 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:12:34.024 134138 DEBUG oslo_concurrency.lockutils [req-95420c09-f362-48bf-a783-08f2d16f9eda - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:12:34.024 134138 DEBUG oslo_concurrency.lockutils [req-95420c09-f362-48bf-a783-08f2d16f9eda - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:12:37.074 134145 DEBUG oslo_service.periodic_task [req-99d4cbd8-553c-4d40-b3e5-83b2bc5a9833 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:12:37.079 134145 DEBUG oslo_concurrency.lockutils [req-95bc9d20-74c2-4d8c-b5ba-f943b7cb716f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:12:37.079 134145 DEBUG oslo_concurrency.lockutils [req-95bc9d20-74c2-4d8c-b5ba-f943b7cb716f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:12:53.069 134146 DEBUG oslo_service.periodic_task [req-99bfaeb0-3743-464f-8831-b36dce3d5cd6 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:12:53.074 134146 DEBUG oslo_concurrency.lockutils [req-c1ef613c-ef23-4846-9c66-6fb7882fb5aa - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:12:53.074 134146 DEBUG oslo_concurrency.lockutils [req-c1ef613c-ef23-4846-9c66-6fb7882fb5aa - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:12:59.066 134140 DEBUG oslo_service.periodic_task [req-a2295b12-4802-42d9-a429-45da7ee2f578 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:12:59.070 134140 DEBUG oslo_concurrency.lockutils [req-aaa4dafa-47d0-48a6-9a6b-dbf9d6e3de13 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:12:59.070 134140 DEBUG oslo_concurrency.lockutils [req-aaa4dafa-47d0-48a6-9a6b-dbf9d6e3de13 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:13:04.030 134138 DEBUG oslo_service.periodic_task [req-95420c09-f362-48bf-a783-08f2d16f9eda - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:13:04.034 134138 DEBUG oslo_concurrency.lockutils [req-7e4e2230-e890-4408-a61b-e97e9e886ead - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:13:04.034 134138 DEBUG oslo_concurrency.lockutils [req-7e4e2230-e890-4408-a61b-e97e9e886ead - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:13:08.055 134145 DEBUG oslo_service.periodic_task [req-95bc9d20-74c2-4d8c-b5ba-f943b7cb716f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:13:08.058 134145 DEBUG oslo_concurrency.lockutils [req-79129eea-4597-46db-9784-7bd42285b66c - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:13:08.059 134145 DEBUG oslo_concurrency.lockutils [req-79129eea-4597-46db-9784-7bd42285b66c - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:13:23.084 134146 DEBUG oslo_service.periodic_task [req-c1ef613c-ef23-4846-9c66-6fb7882fb5aa - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:13:23.088 134146 DEBUG oslo_concurrency.lockutils [req-b9da401c-5ba8-47ef-bde2-56973c1fb058 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:13:23.088 134146 DEBUG oslo_concurrency.lockutils [req-b9da401c-5ba8-47ef-bde2-56973c1fb058 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:13:29.079 134140 DEBUG oslo_service.periodic_task [req-aaa4dafa-47d0-48a6-9a6b-dbf9d6e3de13 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:13:29.085 134140 DEBUG oslo_concurrency.lockutils [req-0a3def3f-b165-4faf-a585-fcd2b87aacdc - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:13:29.085 134140 DEBUG oslo_concurrency.lockutils [req-0a3def3f-b165-4faf-a585-fcd2b87aacdc - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:13:31.826 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 098b9cb4474e44eb920a969666757be1 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:13:31.826 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 098b9cb4474e44eb920a969666757be1 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:13:31.826 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 098b9cb4474e44eb920a969666757be1 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:13:31.826 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 098b9cb4474e44eb920a969666757be1 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:13:31.826 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:13:31.827 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:13:31.827 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:13:31.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:13:31.827 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 098b9cb4474e44eb920a969666757be1 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:13:31.827 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 098b9cb4474e44eb920a969666757be1 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:13:31.827 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 098b9cb4474e44eb920a969666757be1 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:13:31.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 098b9cb4474e44eb920a969666757be1 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:13:31.827 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:13:31.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:13:31.827 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:13:31.827 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:13:31.827 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:13:31.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:13:31.827 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:13:31.827 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:13:31.827 134146 DEBUG oslo_concurrency.lockutils [req-aa5a0e50-5645-43f9-9a13-f953a387e109 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:13:31.828 134140 DEBUG oslo_concurrency.lockutils [req-aa5a0e50-5645-43f9-9a13-f953a387e109 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:13:31.828 134138 DEBUG oslo_concurrency.lockutils [req-aa5a0e50-5645-43f9-9a13-f953a387e109 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:13:31.828 134146 DEBUG nova.scheduler.host_manager [req-aa5a0e50-5645-43f9-9a13-f953a387e109 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:13:31.828 134140 DEBUG nova.scheduler.host_manager [req-aa5a0e50-5645-43f9-9a13-f953a387e109 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:13:31.828 134145 DEBUG oslo_concurrency.lockutils [req-aa5a0e50-5645-43f9-9a13-f953a387e109 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:13:31.828 134138 DEBUG nova.scheduler.host_manager [req-aa5a0e50-5645-43f9-9a13-f953a387e109 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:13:31.828 134146 DEBUG oslo_concurrency.lockutils [req-aa5a0e50-5645-43f9-9a13-f953a387e109 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:13:31.828 134140 DEBUG oslo_concurrency.lockutils [req-aa5a0e50-5645-43f9-9a13-f953a387e109 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:13:31.828 134145 DEBUG nova.scheduler.host_manager [req-aa5a0e50-5645-43f9-9a13-f953a387e109 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:13:31.828 134138 DEBUG oslo_concurrency.lockutils [req-aa5a0e50-5645-43f9-9a13-f953a387e109 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:13:31.828 134145 DEBUG oslo_concurrency.lockutils [req-aa5a0e50-5645-43f9-9a13-f953a387e109 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:13:31.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:13:31.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:13:31.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:13:31.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:13:31.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:13:31.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:13:31.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:13:31.829 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:13:31.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:13:31.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:13:31.829 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:13:31.830 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:13:32.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:13:32.830 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:13:32.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:13:32.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:13:32.830 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:13:32.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:13:32.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:13:32.831 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:13:32.831 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:13:32.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:13:32.832 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:13:32.832 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:13:34.040 134138 DEBUG oslo_service.periodic_task [req-7e4e2230-e890-4408-a61b-e97e9e886ead - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:13:34.045 134138 DEBUG oslo_concurrency.lockutils [req-07702178-8533-433d-8861-53e6dce21cc3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:13:34.045 134138 DEBUG oslo_concurrency.lockutils [req-07702178-8533-433d-8861-53e6dce21cc3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:13:34.832 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:13:34.832 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:13:34.832 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:13:34.833 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:13:34.832 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:13:34.833 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:13:34.833 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:13:34.833 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:13:34.833 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:13:34.834 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:13:34.834 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:13:34.834 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:13:38.068 134145 DEBUG oslo_service.periodic_task [req-79129eea-4597-46db-9784-7bd42285b66c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:13:38.072 134145 DEBUG oslo_concurrency.lockutils [req-34a121d5-1966-422e-90ad-ceed55e86e79 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:13:38.072 134145 DEBUG oslo_concurrency.lockutils [req-34a121d5-1966-422e-90ad-ceed55e86e79 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:13:38.834 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:13:38.834 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:13:38.834 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:13:38.837 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:13:38.837 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:13:38.837 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:13:38.838 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:13:38.838 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:13:38.838 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:13:38.838 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:13:38.838 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:13:38.838 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:13:46.837 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:13:46.837 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:13:46.837 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:13:46.840 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:13:46.840 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:13:46.840 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:13:46.841 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:13:46.841 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:13:46.841 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:13:46.841 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:13:46.841 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:13:46.841 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:13:54.069 134146 DEBUG oslo_service.periodic_task [req-b9da401c-5ba8-47ef-bde2-56973c1fb058 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:13:54.073 134146 DEBUG oslo_concurrency.lockutils [req-97c4e7a6-714f-4200-9b10-4de62e925400 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:13:54.073 134146 DEBUG oslo_concurrency.lockutils [req-97c4e7a6-714f-4200-9b10-4de62e925400 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:14:00.053 134140 DEBUG oslo_service.periodic_task [req-0a3def3f-b165-4faf-a585-fcd2b87aacdc - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:14:00.057 134140 DEBUG oslo_concurrency.lockutils [req-c7048c0d-7c6c-4add-83c7-f55e8c3287cc - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 
01:14:00.058 134140 DEBUG oslo_concurrency.lockutils [req-c7048c0d-7c6c-4add-83c7-f55e8c3287cc - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:14:02.839 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:14:02.839 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:14:02.839 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:14:02.842 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:14:02.842 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:14:02.843 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:14:02.843 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:14:02.843 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:14:02.843 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:14:02.843 134138 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:14:02.844 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:14:02.844 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:14:04.050 134138 DEBUG oslo_service.periodic_task [req-07702178-8533-433d-8861-53e6dce21cc3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:14:04.055 134138 DEBUG oslo_concurrency.lockutils [req-fa91ed68-1297-47aa-a594-ff0297302445 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:14:04.055 134138 DEBUG oslo_concurrency.lockutils [req-fa91ed68-1297-47aa-a594-ff0297302445 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:14:08.081 134145 DEBUG oslo_service.periodic_task [req-34a121d5-1966-422e-90ad-ceed55e86e79 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:14:08.085 134145 DEBUG oslo_concurrency.lockutils [req-cbfbddac-bc50-486a-85e5-8bb4a8d9cca8 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:14:08.085 134145 DEBUG oslo_concurrency.lockutils [req-cbfbddac-bc50-486a-85e5-8bb4a8d9cca8 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:14:25.069 134146 DEBUG oslo_service.periodic_task [req-97c4e7a6-714f-4200-9b10-4de62e925400 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:14:25.073 134146 DEBUG oslo_concurrency.lockutils [req-ea7df869-8129-487a-8577-d27ba3e19c63 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:14:25.073 134146 DEBUG oslo_concurrency.lockutils [req-ea7df869-8129-487a-8577-d27ba3e19c63 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:14:31.051 134140 DEBUG oslo_service.periodic_task [req-c7048c0d-7c6c-4add-83c7-f55e8c3287cc - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:14:31.055 134140 DEBUG oslo_concurrency.lockutils [req-1e50b859-1912-40ae-9e46-e32bed206320 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:14:31.056 134140 DEBUG 
oslo_concurrency.lockutils [req-1e50b859-1912-40ae-9e46-e32bed206320 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:14:34.840 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:14:34.841 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:14:34.841 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:14:34.844 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:14:34.844 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:14:34.845 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:14:34.845 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:14:34.845 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:14:34.845 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:14:34.846 134138 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:14:34.846 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:14:34.846 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:14:35.019 134138 DEBUG oslo_service.periodic_task [req-fa91ed68-1297-47aa-a594-ff0297302445 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:14:35.024 134138 DEBUG oslo_concurrency.lockutils [req-d833772a-40dd-4bea-b604-99d78e33f046 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:14:35.024 134138 DEBUG oslo_concurrency.lockutils [req-d833772a-40dd-4bea-b604-99d78e33f046 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:14:38.092 134145 DEBUG oslo_service.periodic_task [req-cbfbddac-bc50-486a-85e5-8bb4a8d9cca8 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:14:38.097 134145 DEBUG oslo_concurrency.lockutils [req-64865152-be2c-47b1-9d15-446559351581 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:14:38.098 134145 DEBUG oslo_concurrency.lockutils [req-64865152-be2c-47b1-9d15-446559351581 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:14:55.078 134146 DEBUG oslo_service.periodic_task [req-ea7df869-8129-487a-8577-d27ba3e19c63 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:14:55.082 134146 DEBUG oslo_concurrency.lockutils [req-b538fc83-512e-4358-8dcb-802adc65cb54 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:14:55.082 134146 DEBUG oslo_concurrency.lockutils [req-b538fc83-512e-4358-8dcb-802adc65cb54 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:15:01.067 134140 DEBUG oslo_service.periodic_task [req-1e50b859-1912-40ae-9e46-e32bed206320 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:15:01.072 134140 DEBUG oslo_concurrency.lockutils [req-95ce341b-0aa3-4f31-ba65-9469f3bed531 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:15:01.073 134140 DEBUG 
oslo_concurrency.lockutils [req-95ce341b-0aa3-4f31-ba65-9469f3bed531 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:15:05.029 134138 DEBUG oslo_service.periodic_task [req-d833772a-40dd-4bea-b604-99d78e33f046 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:15:05.033 134138 DEBUG oslo_concurrency.lockutils [req-0ef40d25-af86-4031-b8c5-d5a4ca273480 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:15:05.034 134138 DEBUG oslo_concurrency.lockutils [req-0ef40d25-af86-4031-b8c5-d5a4ca273480 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:15:08.105 134145 DEBUG oslo_service.periodic_task [req-64865152-be2c-47b1-9d15-446559351581 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:15:08.110 134145 DEBUG oslo_concurrency.lockutils [req-9b60c83b-533d-4145-8964-9780d1ac801f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:15:08.110 134145 DEBUG oslo_concurrency.lockutils [req-9b60c83b-533d-4145-8964-9780d1ac801f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:15:26.069 134146 DEBUG oslo_service.periodic_task [req-b538fc83-512e-4358-8dcb-802adc65cb54 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:15:26.074 134146 DEBUG oslo_concurrency.lockutils [req-9f011f5d-c856-4648-a2bb-8665bf8a3e3e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:15:26.074 134146 DEBUG oslo_concurrency.lockutils [req-9f011f5d-c856-4648-a2bb-8665bf8a3e3e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:15:31.084 134140 DEBUG oslo_service.periodic_task [req-95ce341b-0aa3-4f31-ba65-9469f3bed531 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:15:31.088 134140 DEBUG oslo_concurrency.lockutils [req-d95f220d-2cde-4930-a427-b70fc55abc34 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:15:31.088 134140 DEBUG oslo_concurrency.lockutils [req-d95f220d-2cde-4930-a427-b70fc55abc34 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:15:35.827 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 5fa4d972807c44b3a288bbb24c81f1e7 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:15:35.827 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 5fa4d972807c44b3a288bbb24c81f1e7 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:15:35.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 5fa4d972807c44b3a288bbb24c81f1e7 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:15:35.827 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 5fa4d972807c44b3a288bbb24c81f1e7 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:15:35.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:15:35.828 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:15:35.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 5fa4d972807c44b3a288bbb24c81f1e7 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:15:35.828 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 5fa4d972807c44b3a288bbb24c81f1e7 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:15:35.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:15:35.828 134145 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:15:35.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:15:35.828 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:15:35.828 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 5fa4d972807c44b3a288bbb24c81f1e7 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:15:35.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:15:35.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 5fa4d972807c44b3a288bbb24c81f1e7 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:15:35.828 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:15:35.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:15:35.828 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:15:35.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:15:35.828 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:15:35.829 134146 DEBUG oslo_concurrency.lockutils [req-d8771624-261d-4860-8de2-f07fb34c231e - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:15:35.829 134140 DEBUG oslo_concurrency.lockutils [req-d8771624-261d-4860-8de2-f07fb34c231e - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:15:35.829 134146 DEBUG nova.scheduler.host_manager [req-d8771624-261d-4860-8de2-f07fb34c231e - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:15:35.829 134146 DEBUG oslo_concurrency.lockutils [req-d8771624-261d-4860-8de2-f07fb34c231e - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:15:35.829 134140 DEBUG nova.scheduler.host_manager [req-d8771624-261d-4860-8de2-f07fb34c231e - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:15:35.829 134145 DEBUG oslo_concurrency.lockutils [req-d8771624-261d-4860-8de2-f07fb34c231e - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:15:35.829 134138 DEBUG oslo_concurrency.lockutils [req-d8771624-261d-4860-8de2-f07fb34c231e - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:15:35.829 134140 DEBUG oslo_concurrency.lockutils [req-d8771624-261d-4860-8de2-f07fb34c231e - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:15:35.829 134145 DEBUG nova.scheduler.host_manager [req-d8771624-261d-4860-8de2-f07fb34c231e - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:15:35.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:15:35.829 134138 DEBUG nova.scheduler.host_manager [req-d8771624-261d-4860-8de2-f07fb34c231e - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:15:35.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:15:35.830 134145 DEBUG oslo_concurrency.lockutils [req-d8771624-261d-4860-8de2-f07fb34c231e - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:15:35.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:15:35.830 134138 DEBUG oslo_concurrency.lockutils [req-d8771624-261d-4860-8de2-f07fb34c231e - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:15:35.830 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:15:35.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:15:35.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:15:35.830 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:15:35.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:15:35.830 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:15:35.830 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:15:35.830 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:15:35.831 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:15:36.019 134138 DEBUG oslo_service.periodic_task [req-0ef40d25-af86-4031-b8c5-d5a4ca273480 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:15:36.024 134138 DEBUG oslo_concurrency.lockutils [req-51b30ca8-28e5-4151-9d2e-d0cb73a490a9 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:15:36.024 134138 DEBUG oslo_concurrency.lockutils [req-51b30ca8-28e5-4151-9d2e-d0cb73a490a9 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:15:36.831 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:15:36.831 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:15:36.832 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:15:36.832 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:15:36.832 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:15:36.832 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:15:36.832 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:15:36.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:15:36.832 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:15:36.832 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:15:36.832 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:15:36.833 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:15:38.118 134145 DEBUG oslo_service.periodic_task [req-9b60c83b-533d-4145-8964-9780d1ac801f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:15:38.122 134145 DEBUG oslo_concurrency.lockutils [req-d34693eb-8b80-40bb-81cc-97c77344a67b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:15:38.122 134145 DEBUG oslo_concurrency.lockutils [req-d34693eb-8b80-40bb-81cc-97c77344a67b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:15:38.834 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:15:38.834 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:15:38.834 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:15:38.834 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:15:38.834 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:15:38.834 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:15:38.834 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:15:38.834 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:15:38.835 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:15:38.835 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:15:38.835 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:15:38.836 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:15:42.835 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:15:42.836 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:15:42.836 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:15:42.838 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:15:42.838 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:15:42.838 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:15:42.839 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:15:42.839 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:15:42.839 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:15:42.840 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:15:42.840 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:15:42.840 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:15:50.840 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:15:50.841 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:15:50.841 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:15:50.843 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:15:50.843 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:15:50.843 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:15:50.846 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:15:50.846 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:15:50.846 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:15:50.846 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:15:50.846 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:15:50.846 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:15:56.080 134146 DEBUG oslo_service.periodic_task [req-9f011f5d-c856-4648-a2bb-8665bf8a3e3e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:15:56.085 134146 DEBUG oslo_concurrency.lockutils [req-7011cc7a-88b4-477b-851f-b99c6dfbe809 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:15:56.085 134146 DEBUG oslo_concurrency.lockutils [req-7011cc7a-88b4-477b-851f-b99c6dfbe809 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:16:02.052 134140 DEBUG oslo_service.periodic_task [req-d95f220d-2cde-4930-a427-b70fc55abc34 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:16:02.056 134140 DEBUG oslo_concurrency.lockutils [req-3f19919c-067a-46d1-be20-719d4681d735 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:16:02.057 134140 DEBUG oslo_concurrency.lockutils [req-3f19919c-067a-46d1-be20-719d4681d735 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:16:06.029 134138 DEBUG oslo_service.periodic_task [req-51b30ca8-28e5-4151-9d2e-d0cb73a490a9 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:16:06.034 134138 DEBUG oslo_concurrency.lockutils [req-a6ef5e7a-cbfb-4a73-b431-2c2a87db5b96 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:16:06.034 134138 DEBUG oslo_concurrency.lockutils [req-a6ef5e7a-cbfb-4a73-b431-2c2a87db5b96 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:16:06.843 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:16:06.843 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:16:06.843 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:16:06.845 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:16:06.845 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:16:06.845 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:16:06.848 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:16:06.848 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:16:06.848 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:16:06.848 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:16:06.848 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:16:06.849 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:16:09.052 134145 DEBUG oslo_service.periodic_task [req-d34693eb-8b80-40bb-81cc-97c77344a67b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:16:09.056 134145 DEBUG oslo_concurrency.lockutils [req-4fe11e64-d3bb-464e-b757-b11e3470ce21 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:16:09.056 134145 DEBUG oslo_concurrency.lockutils [req-4fe11e64-d3bb-464e-b757-b11e3470ce21 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:16:26.090 134146 DEBUG oslo_service.periodic_task [req-7011cc7a-88b4-477b-851f-b99c6dfbe809 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:16:26.094 134146 DEBUG oslo_concurrency.lockutils [req-794a8c0e-e817-4309-9c9c-fc0a2113d205 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:16:26.094 134146 DEBUG oslo_concurrency.lockutils [req-794a8c0e-e817-4309-9c9c-fc0a2113d205 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:16:33.052 134140 DEBUG oslo_service.periodic_task [req-3f19919c-067a-46d1-be20-719d4681d735 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:16:33.056 134140 DEBUG oslo_concurrency.lockutils [req-2ee86665-a307-4a56-8acc-f1466efc9468 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:16:33.057 134140 DEBUG oslo_concurrency.lockutils [req-2ee86665-a307-4a56-8acc-f1466efc9468 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:16:37.020 134138 DEBUG oslo_service.periodic_task [req-a6ef5e7a-cbfb-4a73-b431-2c2a87db5b96 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:16:37.024 134138 DEBUG oslo_concurrency.lockutils [req-53802203-5ad8-4c62-9187-2f3ca4a5b204 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:16:37.024 134138 DEBUG oslo_concurrency.lockutils [req-53802203-5ad8-4c62-9187-2f3ca4a5b204 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:16:38.846 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:16:38.847 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:16:38.847 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:16:38.850 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:16:38.850 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:16:38.850 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:16:38.851 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:16:38.851 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:16:38.852 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:16:38.854 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:16:38.854 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:16:38.854 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:16:39.061 134145 DEBUG oslo_service.periodic_task [req-4fe11e64-d3bb-464e-b757-b11e3470ce21 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:16:39.065 134145 DEBUG oslo_concurrency.lockutils [req-37b6fb7e-903d-4059-9d8f-4974acd1ab57 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:16:39.065 134145 DEBUG oslo_concurrency.lockutils [req-37b6fb7e-903d-4059-9d8f-4974acd1ab57 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:16:56.100 134146 DEBUG oslo_service.periodic_task [req-794a8c0e-e817-4309-9c9c-fc0a2113d205 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:16:56.104 134146 DEBUG oslo_concurrency.lockutils [req-5f71a7a8-fc03-48c8-a456-6c74f5d1c7b4 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:16:56.105 134146 DEBUG oslo_concurrency.lockutils [req-5f71a7a8-fc03-48c8-a456-6c74f5d1c7b4 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:17:03.069 134140 DEBUG oslo_service.periodic_task [req-2ee86665-a307-4a56-8acc-f1466efc9468 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:17:03.073 134140 DEBUG oslo_concurrency.lockutils [req-dd263aa4-56aa-4741-b4ce-c38be3abb1bd - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:17:03.074 134140 DEBUG oslo_concurrency.lockutils [req-dd263aa4-56aa-4741-b4ce-c38be3abb1bd - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:17:07.033 134138 DEBUG oslo_service.periodic_task [req-53802203-5ad8-4c62-9187-2f3ca4a5b204 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:17:07.039 134138 DEBUG oslo_concurrency.lockutils [req-3bccf3ec-635c-4fef-9630-d235b673d28c - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:17:07.039 134138 DEBUG oslo_concurrency.lockutils [req-3bccf3ec-635c-4fef-9630-d235b673d28c - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:17:09.074 134145 DEBUG oslo_service.periodic_task [req-37b6fb7e-903d-4059-9d8f-4974acd1ab57 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:17:09.079 134145 DEBUG oslo_concurrency.lockutils [req-e32f94ad-e1f9-45db-80dd-19b7aac33b3b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:17:09.079 134145 DEBUG oslo_concurrency.lockutils [req-e32f94ad-e1f9-45db-80dd-19b7aac33b3b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:17:26.110 134146 DEBUG oslo_service.periodic_task [req-5f71a7a8-fc03-48c8-a456-6c74f5d1c7b4 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:17:26.114 134146 DEBUG oslo_concurrency.lockutils [req-8a40e510-d5a0-4c58-b0be-74b24fb669f5 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:17:26.114 134146 DEBUG oslo_concurrency.lockutils [req-8a40e510-d5a0-4c58-b0be-74b24fb669f5 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:17:33.086 134140 DEBUG oslo_service.periodic_task [req-dd263aa4-56aa-4741-b4ce-c38be3abb1bd - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:17:33.091 134140 DEBUG oslo_concurrency.lockutils [req-5b6c6181-f078-4f59-8677-681570d1e607 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:17:33.091 134140 DEBUG oslo_concurrency.lockutils [req-5b6c6181-f078-4f59-8677-681570d1e607 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:17:36.826 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 710e2c5ce5b84ff7b3ab7a489db5408b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:17:36.827 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 710e2c5ce5b84ff7b3ab7a489db5408b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:17:36.827 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:17:36.827 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 710e2c5ce5b84ff7b3ab7a489db5408b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:17:36.827 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 710e2c5ce5b84ff7b3ab7a489db5408b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:17:36.827 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:17:36.827 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:17:36.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 710e2c5ce5b84ff7b3ab7a489db5408b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:17:36.827 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:17:36.827 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 710e2c5ce5b84ff7b3ab7a489db5408b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:17:36.827 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:17:36.827 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 710e2c5ce5b84ff7b3ab7a489db5408b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:17:36.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:17:36.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 710e2c5ce5b84ff7b3ab7a489db5408b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:17:36.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:17:36.828 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:17:36.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:17:36.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:17:36.828 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:17:36.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:17:36.828 134140 DEBUG oslo_concurrency.lockutils [req-a14cbd85-9d38-43e9-9ca0-477f2b4b43a8 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:17:36.828 134140 DEBUG nova.scheduler.host_manager [req-a14cbd85-9d38-43e9-9ca0-477f2b4b43a8 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:17:36.828 134140 DEBUG oslo_concurrency.lockutils [req-a14cbd85-9d38-43e9-9ca0-477f2b4b43a8 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:17:36.828 134145 DEBUG oslo_concurrency.lockutils [req-a14cbd85-9d38-43e9-9ca0-477f2b4b43a8 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:17:36.828 134146 DEBUG oslo_concurrency.lockutils [req-a14cbd85-9d38-43e9-9ca0-477f2b4b43a8 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:17:36.829 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:17:36.829 134138 DEBUG oslo_concurrency.lockutils [req-a14cbd85-9d38-43e9-9ca0-477f2b4b43a8 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:17:36.829 134146 DEBUG nova.scheduler.host_manager [req-a14cbd85-9d38-43e9-9ca0-477f2b4b43a8 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:17:36.829 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:17:36.829 134145 DEBUG nova.scheduler.host_manager [req-a14cbd85-9d38-43e9-9ca0-477f2b4b43a8 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:17:36.829 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:17:36.829 134146 DEBUG oslo_concurrency.lockutils [req-a14cbd85-9d38-43e9-9ca0-477f2b4b43a8 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:17:36.829 134138 DEBUG nova.scheduler.host_manager [req-a14cbd85-9d38-43e9-9ca0-477f2b4b43a8 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'.
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:17:36.829 134145 DEBUG oslo_concurrency.lockutils [req-a14cbd85-9d38-43e9-9ca0-477f2b4b43a8 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:17:36.829 134138 DEBUG oslo_concurrency.lockutils [req-a14cbd85-9d38-43e9-9ca0-477f2b4b43a8 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:17:36.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:17:36.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:17:36.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:17:36.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:17:36.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:17:36.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:17:36.830 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:17:36.830 134145 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:17:36.830 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:17:37.044 134138 DEBUG oslo_service.periodic_task [req-3bccf3ec-635c-4fef-9630-d235b673d28c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:17:37.049 134138 DEBUG oslo_concurrency.lockutils [req-2f53eeef-5c5d-4345-b24d-06171ec7218f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:17:37.049 134138 DEBUG oslo_concurrency.lockutils [req-2f53eeef-5c5d-4345-b24d-06171ec7218f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:17:37.830 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:17:37.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:17:37.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:17:37.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:17:37.831 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:17:37.831 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:17:37.831 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:17:37.831 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:17:37.831 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:17:37.832 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:17:37.832 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:17:37.832 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:17:39.832 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:17:39.832 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:17:39.832 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:17:39.833 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:17:39.833 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:17:39.833 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:17:39.833 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:17:39.833 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:17:39.834 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:17:39.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:17:39.835 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:17:39.835 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:17:40.051 134145 DEBUG oslo_service.periodic_task [req-e32f94ad-e1f9-45db-80dd-19b7aac33b3b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:17:40.057 134145 DEBUG oslo_concurrency.lockutils [req-690b1c9b-a03c-414a-a7f7-d5c8bc212ac3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:17:40.057 134145 DEBUG oslo_concurrency.lockutils [req-690b1c9b-a03c-414a-a7f7-d5c8bc212ac3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:17:43.834 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:17:43.834 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:17:43.834 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:17:43.835 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:17:43.835 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:17:43.835 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:17:43.836 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:17:43.836 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:17:43.836 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:17:43.837 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:17:43.837 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:17:43.837 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:17:51.838 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:17:51.839 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:17:51.839 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:17:51.842 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:17:51.842 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:17:51.842 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:17:51.843 134145 DEBUG oslo_messaging._drivers.amqpdriver
[-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:17:51.843 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:17:51.844 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:17:51.845 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:17:51.845 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:17:51.846 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:17:56.120 134146 DEBUG oslo_service.periodic_task [req-8a40e510-d5a0-4c58-b0be-74b24fb669f5 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:17:56.124 134146 DEBUG oslo_concurrency.lockutils [req-e48f8091-0c0f-4c7e-8520-070218de8d54 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:17:56.125 134146 DEBUG oslo_concurrency.lockutils [req-e48f8091-0c0f-4c7e-8520-070218de8d54 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:18:03.102 134140 DEBUG oslo_service.periodic_task [req-5b6c6181-f078-4f59-8677-681570d1e607 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:18:03.107 134140 DEBUG oslo_concurrency.lockutils [req-3b7bb422-8bc8-4733-a3a9-29844b5169ac - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:18:03.107 134140 DEBUG oslo_concurrency.lockutils [req-3b7bb422-8bc8-4733-a3a9-29844b5169ac - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:18:07.055 134138 DEBUG oslo_service.periodic_task [req-2f53eeef-5c5d-4345-b24d-06171ec7218f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:18:07.059 134138 DEBUG oslo_concurrency.lockutils [req-7b04f14f-1006-4feb-959a-64931a8faabd - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:18:07.059 134138 DEBUG oslo_concurrency.lockutils [req-7b04f14f-1006-4feb-959a-64931a8faabd - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:18:07.840 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:18:07.841 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:18:07.841 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:18:07.845 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:18:07.845 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:18:07.845 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:18:07.846 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:18:07.846 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:18:07.846 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:18:07.847 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:18:07.847 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:18:07.848 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:18:10.063 134145 DEBUG oslo_service.periodic_task [req-690b1c9b-a03c-414a-a7f7-d5c8bc212ac3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:18:10.067 134145 DEBUG oslo_concurrency.lockutils [req-450ea2b4-2489-4040-a387-3017bca66859 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:18:10.067 134145 DEBUG oslo_concurrency.lockutils [req-450ea2b4-2489-4040-a387-3017bca66859 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:18:26.130 134146 DEBUG oslo_service.periodic_task [req-e48f8091-0c0f-4c7e-8520-070218de8d54 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:18:26.136 134146 DEBUG oslo_concurrency.lockutils [req-028ac581-166e-4e97-8e15-42b1d843bbaa - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:18:26.136 134146 DEBUG oslo_concurrency.lockutils [req-028ac581-166e-4e97-8e15-42b1d843bbaa - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:18:33.112 134140 DEBUG oslo_service.periodic_task [req-3b7bb422-8bc8-4733-a3a9-29844b5169ac - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:18:33.116 134140 DEBUG oslo_concurrency.lockutils [req-0e7f4c46-e061-44de-864f-4448d3f4ba1d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:18:33.116 134140 DEBUG oslo_concurrency.lockutils [req-0e7f4c46-e061-44de-864f-4448d3f4ba1d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:18:37.066 134138 DEBUG oslo_service.periodic_task [req-7b04f14f-1006-4feb-959a-64931a8faabd - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:18:37.071 134138 DEBUG oslo_concurrency.lockutils [req-8344befc-f475-450f-8149-233a08521e5c - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:18:37.072 134138 DEBUG oslo_concurrency.lockutils [req-8344befc-f475-450f-8149-233a08521e5c - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:18:39.846 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:18:39.846 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:18:39.846 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:18:39.851 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:18:39.851 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:18:39.851 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:18:39.851 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:18:39.852 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:18:39.852 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:18:39.852 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:18:39.852 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:18:39.852 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:18:41.052 134145 DEBUG oslo_service.periodic_task [req-450ea2b4-2489-4040-a387-3017bca66859 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:18:41.056 134145 DEBUG oslo_concurrency.lockutils [req-3b118a7e-eed9-4066-bf86-5901a7195e75 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:18:41.056 134145 DEBUG oslo_concurrency.lockutils [req-3b118a7e-eed9-4066-bf86-5901a7195e75 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:18:57.069 134146 DEBUG oslo_service.periodic_task [req-028ac581-166e-4e97-8e15-42b1d843bbaa - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:18:57.074 134146 DEBUG oslo_concurrency.lockutils [req-d2ac1f60-a6a8-4326-816e-d8b3be0e8066 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:18:57.074 134146 DEBUG oslo_concurrency.lockutils [req-d2ac1f60-a6a8-4326-816e-d8b3be0e8066 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:19:03.121 134140 DEBUG oslo_service.periodic_task [req-0e7f4c46-e061-44de-864f-4448d3f4ba1d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:19:03.126 134140 DEBUG oslo_concurrency.lockutils [req-878fa7ea-7ebc-490f-bc53-a4b0fd75b518 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:19:03.126 134140 DEBUG oslo_concurrency.lockutils [req-878fa7ea-7ebc-490f-bc53-a4b0fd75b518 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:19:07.076 134138 DEBUG oslo_service.periodic_task [req-8344befc-f475-450f-8149-233a08521e5c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:19:07.080 134138 DEBUG oslo_concurrency.lockutils [req-f77c05cb-a135-4911-851f-1d7823d3766b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:19:07.081 134138 DEBUG oslo_concurrency.lockutils [req-f77c05cb-a135-4911-851f-1d7823d3766b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:19:11.065 134145 DEBUG oslo_service.periodic_task [req-3b118a7e-eed9-4066-bf86-5901a7195e75 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:19:11.070 134145 DEBUG oslo_concurrency.lockutils [req-985c89d8-8f59-48ae-9877-60645e85bed1 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:19:11.070 134145 DEBUG oslo_concurrency.lockutils [req-985c89d8-8f59-48ae-9877-60645e85bed1 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:19:27.079 134146 DEBUG oslo_service.periodic_task [req-d2ac1f60-a6a8-4326-816e-d8b3be0e8066 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:19:27.084 134146 DEBUG oslo_concurrency.lockutils [req-c00e0458-7878-4068-ae39-94e7d18fbc4b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:19:27.084 134146 DEBUG oslo_concurrency.lockutils [req-c00e0458-7878-4068-ae39-94e7d18fbc4b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:19:33.131 134140 DEBUG oslo_service.periodic_task [req-878fa7ea-7ebc-490f-bc53-a4b0fd75b518 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:19:33.135 134140 DEBUG oslo_concurrency.lockutils [req-1aefd01e-69b7-4ab2-896e-c88697abb40a - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:19:33.135 134140 DEBUG oslo_concurrency.lockutils [req-1aefd01e-69b7-4ab2-896e-c88697abb40a - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:19:38.019 134138 DEBUG oslo_service.periodic_task [req-f77c05cb-a135-4911-851f-1d7823d3766b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:19:38.024 134138 DEBUG oslo_concurrency.lockutils [req-990f7908-b74b-4164-8561-05d1a47829f6 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:19:38.024 134138 DEBUG oslo_concurrency.lockutils [req-990f7908-b74b-4164-8561-05d1a47829f6 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:19:39.825 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 491bc292b627400f90b5e5b4d180c738 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:19:39.825 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 491bc292b627400f90b5e5b4d180c738 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:19:39.825 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:19:39.825 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:19:39.825 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 491bc292b627400f90b5e5b4d180c738 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:19:39.825 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 491bc292b627400f90b5e5b4d180c738 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:19:39.825 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:19:39.825 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:19:39.825 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 491bc292b627400f90b5e5b4d180c738 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:19:39.825 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:19:39.825 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:19:39.826 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:19:39.826 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 491bc292b627400f90b5e5b4d180c738 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:19:39.826 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:19:39.826 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:19:39.826 134145 DEBUG oslo_concurrency.lockutils [req-5c22f41c-9d80-4511-9b52-79fe5546b36b - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:19:39.826 134145 DEBUG nova.scheduler.host_manager [req-5c22f41c-9d80-4511-9b52-79fe5546b36b - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:19:39.826 134146 DEBUG oslo_concurrency.lockutils [req-5c22f41c-9d80-4511-9b52-79fe5546b36b - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:19:39.826 134145 DEBUG oslo_concurrency.lockutils [req-5c22f41c-9d80-4511-9b52-79fe5546b36b - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:19:39.826 134146 DEBUG nova.scheduler.host_manager [req-5c22f41c-9d80-4511-9b52-79fe5546b36b - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:19:39.826 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 491bc292b627400f90b5e5b4d180c738 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:19:39.827 134146 DEBUG oslo_concurrency.lockutils [req-5c22f41c-9d80-4511-9b52-79fe5546b36b - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:19:39.827 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:19:39.827 134140 DEBUG oslo_concurrency.lockutils [req-5c22f41c-9d80-4511-9b52-79fe5546b36b - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:19:39.827 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:19:39.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:19:39.827 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:19:39.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 491bc292b627400f90b5e5b4d180c738 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:19:39.827 134140 DEBUG nova.scheduler.host_manager [req-5c22f41c-9d80-4511-9b52-79fe5546b36b - - - - -] Successfully synced 
instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:19:39.827 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:19:39.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:19:39.827 134140 DEBUG oslo_concurrency.lockutils [req-5c22f41c-9d80-4511-9b52-79fe5546b36b - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:19:39.827 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:19:39.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:19:39.827 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:19:39.828 134138 DEBUG oslo_concurrency.lockutils [req-5c22f41c-9d80-4511-9b52-79fe5546b36b - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:19:39.828 134138 DEBUG nova.scheduler.host_manager [req-5c22f41c-9d80-4511-9b52-79fe5546b36b - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:19:39.828 134138 DEBUG oslo_concurrency.lockutils [req-5c22f41c-9d80-4511-9b52-79fe5546b36b - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:19:39.828 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:19:39.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:19:39.829 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:19:39.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:19:39.829 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:19:39.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:19:40.828 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:19:40.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:19:40.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:19:40.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:19:40.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:19:40.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:19:40.830 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:19:40.830 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:19:40.830 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:19:40.830 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:19:40.830 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:19:40.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:19:41.074 134145 DEBUG oslo_service.periodic_task [req-985c89d8-8f59-48ae-9877-60645e85bed1 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:19:41.079 
134145 DEBUG oslo_concurrency.lockutils [req-739a4385-3b40-4896-ac6b-39aee86801c7 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:19:41.080 134145 DEBUG oslo_concurrency.lockutils [req-739a4385-3b40-4896-ac6b-39aee86801c7 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:19:42.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:19:42.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:19:42.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:19:42.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:19:42.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:19:42.831 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:19:42.832 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:19:42.832 134145 DEBUG oslo_messaging._drivers.amqpdriver 
[-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:19:42.832 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:19:42.833 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:19:42.833 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:19:42.833 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:19:46.834 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:19:46.834 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:19:46.834 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:19:46.834 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:19:46.834 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:19:46.834 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:19:46.835 134145 DEBUG oslo_messaging._drivers.amqpdriver 
[-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:19:46.835 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:19:46.835 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:19:46.836 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:19:46.836 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:19:46.836 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:19:54.835 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:19:54.835 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:19:54.835 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:19:54.835 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:19:54.836 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:19:54.836 134140 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:19:54.836 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:19:54.836 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:19:54.836 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:19:54.837 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:19:54.838 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:19:54.838 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:19:57.091 134146 DEBUG oslo_service.periodic_task [req-c00e0458-7878-4068-ae39-94e7d18fbc4b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:19:57.095 134146 DEBUG oslo_concurrency.lockutils [req-5714031f-3729-4658-89ea-9250ca0341c1 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:19:57.096 134146 DEBUG oslo_concurrency.lockutils [req-5714031f-3729-4658-89ea-9250ca0341c1 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:20:03.150 134140 DEBUG oslo_service.periodic_task [req-1aefd01e-69b7-4ab2-896e-c88697abb40a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:20:03.154 134140 DEBUG oslo_concurrency.lockutils [req-f21b874c-466f-403d-9873-189da5bacb5e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:20:03.154 134140 DEBUG oslo_concurrency.lockutils [req-f21b874c-466f-403d-9873-189da5bacb5e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:20:09.020 134138 DEBUG oslo_service.periodic_task [req-990f7908-b74b-4164-8561-05d1a47829f6 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:20:09.024 134138 DEBUG oslo_concurrency.lockutils [req-7c908c2b-8f1b-4fab-a3f9-5a2ee0d8921c - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:20:09.025 134138 DEBUG oslo_concurrency.lockutils [req-7c908c2b-8f1b-4fab-a3f9-5a2ee0d8921c - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:20:10.837 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:20:10.837 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:20:10.838 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:20:10.838 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:20:10.838 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:20:10.838 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:20:10.838 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:20:10.838 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:20:10.838 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:20:10.839 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:20:10.840 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:20:10.840 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:20:12.052 134145 DEBUG oslo_service.periodic_task [req-739a4385-3b40-4896-ac6b-39aee86801c7 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:20:12.056 134145 DEBUG oslo_concurrency.lockutils [req-407c111e-d975-4490-b1b8-41c49ca03b51 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:20:12.057 134145 DEBUG oslo_concurrency.lockutils [req-407c111e-d975-4490-b1b8-41c49ca03b51 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:20:28.069 134146 DEBUG oslo_service.periodic_task [req-5714031f-3729-4658-89ea-9250ca0341c1 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:20:28.073 134146 DEBUG oslo_concurrency.lockutils [req-474ad162-5624-420b-8f1a-dd255fd39085 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:20:28.073 134146 DEBUG oslo_concurrency.lockutils [req-474ad162-5624-420b-8f1a-dd255fd39085 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:20:33.159 134140 DEBUG oslo_service.periodic_task [req-f21b874c-466f-403d-9873-189da5bacb5e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:20:33.163 134140 DEBUG oslo_concurrency.lockutils [req-ce4af00b-83fb-40e1-9701-b26cb2463f19 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:20:33.163 134140 DEBUG oslo_concurrency.lockutils [req-ce4af00b-83fb-40e1-9701-b26cb2463f19 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:20:39.034 134138 DEBUG oslo_service.periodic_task [req-7c908c2b-8f1b-4fab-a3f9-5a2ee0d8921c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:20:39.038 134138 DEBUG oslo_concurrency.lockutils [req-4c22e46b-5e28-4275-aaad-4de713cf55d4 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:20:39.039 134138 DEBUG oslo_concurrency.lockutils [req-4c22e46b-5e28-4275-aaad-4de713cf55d4 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:20:42.840 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:20:42.840 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:20:42.841 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:20:42.842 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:20:42.842 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:20:42.842 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:20:42.847 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:20:42.847 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:20:42.847 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:20:42.847 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:20:42.847 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:20:42.847 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:20:43.052 134145 DEBUG oslo_service.periodic_task [req-407c111e-d975-4490-b1b8-41c49ca03b51 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:20:43.056 134145 DEBUG oslo_concurrency.lockutils [req-54214cb7-76ee-4b5b-8177-abf6dfe59195 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:20:43.056 134145 DEBUG oslo_concurrency.lockutils [req-54214cb7-76ee-4b5b-8177-abf6dfe59195 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:20:58.078 134146 DEBUG oslo_service.periodic_task [req-474ad162-5624-420b-8f1a-dd255fd39085 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:20:58.083 134146 DEBUG oslo_concurrency.lockutils [req-2e791378-a39f-4223-b62a-95362de9e2b2 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:20:58.083 134146 DEBUG oslo_concurrency.lockutils [req-2e791378-a39f-4223-b62a-95362de9e2b2 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:21:03.167 134140 DEBUG oslo_service.periodic_task [req-ce4af00b-83fb-40e1-9701-b26cb2463f19 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:21:03.172 134140 DEBUG oslo_concurrency.lockutils [req-dc65eeab-567c-446d-96e5-d62a8b9ae3f5 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:21:03.172 134140 DEBUG oslo_concurrency.lockutils [req-dc65eeab-567c-446d-96e5-d62a8b9ae3f5 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:21:09.047 134138 DEBUG oslo_service.periodic_task [req-4c22e46b-5e28-4275-aaad-4de713cf55d4 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:21:09.051 134138 DEBUG oslo_concurrency.lockutils [req-af34a2cf-1157-46d1-a304-a9d2c80d8d38 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:21:09.051 134138 DEBUG oslo_concurrency.lockutils [req-af34a2cf-1157-46d1-a304-a9d2c80d8d38 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:21:14.052 134145 DEBUG oslo_service.periodic_task [req-54214cb7-76ee-4b5b-8177-abf6dfe59195 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:21:14.056 134145 DEBUG oslo_concurrency.lockutils [req-b1b30c0e-44bf-40eb-83c6-0edc5e512fb4 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:21:14.057 134145 DEBUG oslo_concurrency.lockutils [req-b1b30c0e-44bf-40eb-83c6-0edc5e512fb4 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:21:29.069 134146 DEBUG oslo_service.periodic_task [req-2e791378-a39f-4223-b62a-95362de9e2b2 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:21:29.073 134146 DEBUG oslo_concurrency.lockutils [req-a7e21470-32df-482b-87a8-6f3928d3f310 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:21:29.074 134146 DEBUG oslo_concurrency.lockutils [req-a7e21470-32df-482b-87a8-6f3928d3f310 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:21:33.177 134140 DEBUG oslo_service.periodic_task [req-dc65eeab-567c-446d-96e5-d62a8b9ae3f5 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:21:33.181 134140 DEBUG oslo_concurrency.lockutils [req-743ea3cc-cbe9-424b-862a-a873593afcfd - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:21:33.182 134140 DEBUG oslo_concurrency.lockutils [req-743ea3cc-cbe9-424b-862a-a873593afcfd - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:21:39.061 134138 DEBUG oslo_service.periodic_task [req-af34a2cf-1157-46d1-a304-a9d2c80d8d38 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:21:39.064 134138 DEBUG oslo_concurrency.lockutils [req-065ee469-b910-45c1-a495-58f9103f060f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:21:39.065 134138 DEBUG oslo_concurrency.lockutils [req-065ee469-b910-45c1-a495-58f9103f060f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:21:42.824 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 3ee5105d10a8454581999a6042981873 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:21:42.824 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 3ee5105d10a8454581999a6042981873 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:21:42.824 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 3ee5105d10a8454581999a6042981873 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:21:42.824 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 3ee5105d10a8454581999a6042981873 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:21:42.825 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:21:42.825 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:21:42.825 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:21:42.825 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 3ee5105d10a8454581999a6042981873 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:21:42.825 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:21:42.825 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 3ee5105d10a8454581999a6042981873 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:21:42.825 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 3ee5105d10a8454581999a6042981873 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:21:42.825 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 3ee5105d10a8454581999a6042981873 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:21:42.825 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:21:42.825 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:21:42.825 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:21:42.825 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:21:42.825 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:21:42.825 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:21:42.825 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:21:42.825 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:21:42.826 134138 DEBUG oslo_concurrency.lockutils [req-34e42d48-c109-45d2-a214-e8a3ec45b32c - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:21:42.826 134145 DEBUG oslo_concurrency.lockutils [req-34e42d48-c109-45d2-a214-e8a3ec45b32c - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:21:42.826 134146 DEBUG oslo_concurrency.lockutils [req-34e42d48-c109-45d2-a214-e8a3ec45b32c - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:21:42.826 134140 DEBUG oslo_concurrency.lockutils [req-34e42d48-c109-45d2-a214-e8a3ec45b32c - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:21:42.826 134138 DEBUG nova.scheduler.host_manager [req-34e42d48-c109-45d2-a214-e8a3ec45b32c - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:21:42.826 134145 DEBUG nova.scheduler.host_manager [req-34e42d48-c109-45d2-a214-e8a3ec45b32c - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:21:42.826 134146 DEBUG nova.scheduler.host_manager [req-34e42d48-c109-45d2-a214-e8a3ec45b32c - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:21:42.826 134140 DEBUG nova.scheduler.host_manager [req-34e42d48-c109-45d2-a214-e8a3ec45b32c - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:21:42.826 134138 DEBUG oslo_concurrency.lockutils [req-34e42d48-c109-45d2-a214-e8a3ec45b32c - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:21:42.826 134145 DEBUG oslo_concurrency.lockutils [req-34e42d48-c109-45d2-a214-e8a3ec45b32c - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:21:42.826 134146 DEBUG oslo_concurrency.lockutils [req-34e42d48-c109-45d2-a214-e8a3ec45b32c - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:21:42.826 134140 DEBUG oslo_concurrency.lockutils [req-34e42d48-c109-45d2-a214-e8a3ec45b32c - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:21:42.826 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:21:42.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:21:42.827 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:21:42.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:21:42.827 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:21:42.827 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:21:42.827 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:21:42.827 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:21:42.827 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:21:42.828 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:21:42.828 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:21:42.828 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:21:43.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:21:43.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:21:43.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:21:43.828 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:21:43.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:21:43.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:21:43.829 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:21:43.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:21:43.829 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:21:43.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:21:43.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:21:43.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:21:44.062 134145 DEBUG oslo_service.periodic_task [req-b1b30c0e-44bf-40eb-83c6-0edc5e512fb4 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:21:44.066 134145 DEBUG oslo_concurrency.lockutils [req-dfecc433-0d27-4191-86ec-d7a03c3408e2 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:21:44.066 134145 DEBUG oslo_concurrency.lockutils [req-dfecc433-0d27-4191-86ec-d7a03c3408e2 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:21:45.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:21:45.831 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:21:45.831 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:21:45.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:21:45.831 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:21:45.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:21:45.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:21:45.831 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:21:45.832 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:21:45.832 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:21:45.832 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:21:45.832 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:21:49.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:21:49.834 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:21:49.834 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:21:49.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:21:49.834 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:21:49.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:21:49.834 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:21:49.834 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:21:49.834 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:21:49.836 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:21:49.837 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:21:49.837 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:21:57.837 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:21:57.837 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:21:57.838 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:21:57.838 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:21:57.838 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:21:57.838 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:21:57.838 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:21:57.838 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:21:57.838 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:21:57.841 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:21:57.841 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:21:57.842 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:21:59.079 134146 DEBUG oslo_service.periodic_task [req-a7e21470-32df-482b-87a8-6f3928d3f310 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:21:59.083 134146 DEBUG oslo_concurrency.lockutils [req-1ccae27c-a69b-401e-9da7-2770328108b5 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:21:59.084 134146 DEBUG oslo_concurrency.lockutils [req-1ccae27c-a69b-401e-9da7-2770328108b5 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:22:03.190 134140 DEBUG oslo_service.periodic_task [req-743ea3cc-cbe9-424b-862a-a873593afcfd - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:22:03.194 134140 DEBUG oslo_concurrency.lockutils [req-736b566d-6ea9-4aa9-b1b7-92fa77a94fb3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:22:03.194 134140 DEBUG oslo_concurrency.lockutils [req-736b566d-6ea9-4aa9-b1b7-92fa77a94fb3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:22:09.074 134138 DEBUG oslo_service.periodic_task [req-065ee469-b910-45c1-a495-58f9103f060f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:22:09.078 134138 DEBUG oslo_concurrency.lockutils [req-20905793-9af8-4c7d-aa6b-170a5797c649 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:22:09.078 134138 DEBUG oslo_concurrency.lockutils [req-20905793-9af8-4c7d-aa6b-170a5797c649 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:22:13.839 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:22:13.840 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:22:13.840 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:22:13.840 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:22:13.840 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:22:13.840 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:22:13.841 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:22:13.841 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:22:13.841 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:22:13.843 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:22:13.844 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:22:13.844 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:22:15.052 134145 DEBUG oslo_service.periodic_task [req-dfecc433-0d27-4191-86ec-d7a03c3408e2 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:22:15.056 134145 DEBUG oslo_concurrency.lockutils [req-589f3809-246b-49e1-b373-5e7000f3548d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:22:15.056 134145 DEBUG oslo_concurrency.lockutils [req-589f3809-246b-49e1-b373-5e7000f3548d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:22:29.095 134146 DEBUG oslo_service.periodic_task [req-1ccae27c-a69b-401e-9da7-2770328108b5 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:22:29.100 134146 DEBUG oslo_concurrency.lockutils [req-62a38060-7d13-4748-9bd8-57800b9c589e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:22:29.100 134146 DEBUG oslo_concurrency.lockutils [req-62a38060-7d13-4748-9bd8-57800b9c589e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:22:33.200 134140 DEBUG oslo_service.periodic_task [req-736b566d-6ea9-4aa9-b1b7-92fa77a94fb3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:22:33.204 134140 DEBUG oslo_concurrency.lockutils [req-d9c6aa29-5695-4fc3-8f4a-827228f9927c - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:22:33.204 134140 DEBUG oslo_concurrency.lockutils [req-d9c6aa29-5695-4fc3-8f4a-827228f9927c - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:22:39.088 134138 DEBUG oslo_service.periodic_task [req-20905793-9af8-4c7d-aa6b-170a5797c649 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:22:39.092 134138 DEBUG oslo_concurrency.lockutils [req-44da0ba7-4de1-419b-9332-94c763c804ba - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:22:39.092 134138 DEBUG oslo_concurrency.lockutils [req-44da0ba7-4de1-419b-9332-94c763c804ba - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:22:45.060 134145 DEBUG oslo_service.periodic_task [req-589f3809-246b-49e1-b373-5e7000f3548d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:22:45.064 134145 DEBUG oslo_concurrency.lockutils [req-55f372ae-4e38-4afb-b0c0-9949dc518d42 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:22:45.065 134145 DEBUG oslo_concurrency.lockutils [req-55f372ae-4e38-4afb-b0c0-9949dc518d42 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:22:45.841 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:22:45.842 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:22:45.842 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:22:45.842 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:22:45.842 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:22:45.842 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:22:45.842 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:22:45.843 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:22:45.843 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:22:45.846 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:22:45.846 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:22:45.846 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:23:00.069 134146 DEBUG oslo_service.periodic_task [req-62a38060-7d13-4748-9bd8-57800b9c589e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:23:00.074 134146 DEBUG oslo_concurrency.lockutils [req-518bd15f-9e75-42ce-ac92-29318f434f57 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:23:00.074 134146 DEBUG oslo_concurrency.lockutils [req-518bd15f-9e75-42ce-ac92-29318f434f57 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:23:03.209 134140 DEBUG oslo_service.periodic_task [req-d9c6aa29-5695-4fc3-8f4a-827228f9927c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:23:03.215 134140 DEBUG oslo_concurrency.lockutils [req-c47f844a-1ffb-42aa-89cb-31662d749d16 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:23:03.215 134140 DEBUG oslo_concurrency.lockutils [req-c47f844a-1ffb-42aa-89cb-31662d749d16 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:23:09.100 134138 DEBUG oslo_service.periodic_task [req-44da0ba7-4de1-419b-9332-94c763c804ba - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:23:09.104 134138 DEBUG oslo_concurrency.lockutils [req-c3dce3d7-1c2b-45e6-83ab-30c3b6b639d7 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:23:09.104 134138 DEBUG oslo_concurrency.lockutils [req-c3dce3d7-1c2b-45e6-83ab-30c3b6b639d7 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:23:15.070 134145 DEBUG oslo_service.periodic_task [req-55f372ae-4e38-4afb-b0c0-9949dc518d42 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:23:15.074 134145 DEBUG oslo_concurrency.lockutils [req-08734ffb-7234-4b2c-a3b5-899d5ad11d7b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:23:15.074 134145 DEBUG oslo_concurrency.lockutils [req-08734ffb-7234-4b2c-a3b5-899d5ad11d7b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:23:30.083 134146 DEBUG oslo_service.periodic_task [req-518bd15f-9e75-42ce-ac92-29318f434f57 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:23:30.087 134146 DEBUG oslo_concurrency.lockutils [req-f34b2f5c-d9a3-48d8-928e-36914837a4e6 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:23:30.088 134146 DEBUG 
oslo_concurrency.lockutils [req-f34b2f5c-d9a3-48d8-928e-36914837a4e6 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:23:34.052 134140 DEBUG oslo_service.periodic_task [req-c47f844a-1ffb-42aa-89cb-31662d749d16 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:23:34.057 134140 DEBUG oslo_concurrency.lockutils [req-9fb12051-c9fe-49c3-8011-17af44c9b342 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:23:34.057 134140 DEBUG oslo_concurrency.lockutils [req-9fb12051-c9fe-49c3-8011-17af44c9b342 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:23:40.020 134138 DEBUG oslo_service.periodic_task [req-c3dce3d7-1c2b-45e6-83ab-30c3b6b639d7 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:23:40.027 134138 DEBUG oslo_concurrency.lockutils [req-8d2351cd-7db3-4fdc-8ba0-518e6c01cdf3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:23:40.027 134138 DEBUG oslo_concurrency.lockutils [req-8d2351cd-7db3-4fdc-8ba0-518e6c01cdf3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:23:44.838 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 2518e966d139499e929c7fd22de93d46 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:23:44.838 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 2518e966d139499e929c7fd22de93d46 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:23:44.838 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 2518e966d139499e929c7fd22de93d46 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:23:44.838 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 2518e966d139499e929c7fd22de93d46 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:23:44.838 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:23:44.839 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 2518e966d139499e929c7fd22de93d46 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:23:44.839 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:23:44.839 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:23:44.839 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 2518e966d139499e929c7fd22de93d46 poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:23:44.839 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:23:44.839 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:23:44.839 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 2518e966d139499e929c7fd22de93d46 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:23:44.839 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:23:44.839 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 2518e966d139499e929c7fd22de93d46 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:23:44.839 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:23:44.839 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:23:44.839 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:23:44.839 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:23:44.839 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:23:44.839 134140 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:23:44.839 134145 DEBUG oslo_concurrency.lockutils [req-237ea311-7518-4677-b222-68937eac023a - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:23:44.840 134145 DEBUG nova.scheduler.host_manager [req-237ea311-7518-4677-b222-68937eac023a - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:23:44.840 134146 DEBUG oslo_concurrency.lockutils [req-237ea311-7518-4677-b222-68937eac023a - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:23:44.840 134138 DEBUG oslo_concurrency.lockutils [req-237ea311-7518-4677-b222-68937eac023a - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:23:44.840 134145 DEBUG oslo_concurrency.lockutils [req-237ea311-7518-4677-b222-68937eac023a - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:23:44.840 134138 DEBUG nova.scheduler.host_manager [req-237ea311-7518-4677-b222-68937eac023a - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:23:44.840 134146 DEBUG nova.scheduler.host_manager [req-237ea311-7518-4677-b222-68937eac023a - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:23:44.840 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:23:44.840 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:23:44.840 134138 DEBUG oslo_concurrency.lockutils [req-237ea311-7518-4677-b222-68937eac023a - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:23:44.840 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:23:44.840 134140 DEBUG oslo_concurrency.lockutils [req-237ea311-7518-4677-b222-68937eac023a - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:23:44.840 134146 DEBUG oslo_concurrency.lockutils [req-237ea311-7518-4677-b222-68937eac023a - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:23:44.841 134140 DEBUG nova.scheduler.host_manager [req-237ea311-7518-4677-b222-68937eac023a - - - - -] Successfully synced instances from host 
'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:23:44.841 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:23:44.841 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:23:44.841 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:23:44.841 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:23:44.841 134140 DEBUG oslo_concurrency.lockutils [req-237ea311-7518-4677-b222-68937eac023a - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:23:44.841 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:23:44.841 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:23:44.841 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:23:44.842 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:23:44.842 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:23:45.079 134145 DEBUG oslo_service.periodic_task [req-08734ffb-7234-4b2c-a3b5-899d5ad11d7b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:23:45.083 134145 DEBUG oslo_concurrency.lockutils [req-0072e192-2ce6-454e-bfc9-5f52e4614cd2 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:23:45.084 134145 DEBUG oslo_concurrency.lockutils [req-0072e192-2ce6-454e-bfc9-5f52e4614cd2 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:23:45.842 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:23:45.842 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:23:45.842 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:23:45.842 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:23:45.843 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:23:45.842 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] 
AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:23:45.843 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:23:45.843 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:23:45.843 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:23:45.844 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:23:45.844 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:23:45.844 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:23:47.844 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:23:47.844 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:23:47.844 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:23:47.844 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:23:47.844 134138 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:23:47.844 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:23:47.845 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:23:47.845 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:23:47.845 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:23:47.847 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:23:47.847 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:23:47.847 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:23:51.846 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:23:51.846 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:23:51.846 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:23:51.846 134146 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:23:51.847 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:23:51.847 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:23:51.849 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:23:51.849 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:23:51.850 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:23:51.850 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:23:51.850 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:23:51.851 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:23:59.850 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:23:59.850 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:23:59.851 134138 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:23:59.851 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:23:59.851 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:23:59.851 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:23:59.852 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:23:59.852 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:23:59.852 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:23:59.855 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:23:59.856 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:23:59.856 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:24:00.092 134146 DEBUG oslo_service.periodic_task [req-f34b2f5c-d9a3-48d8-928e-36914837a4e6 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks 
/usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:24:00.098 134146 DEBUG oslo_concurrency.lockutils [req-ddbcb3ad-d405-4702-b8d8-28f9b9cae30f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:24:00.098 134146 DEBUG oslo_concurrency.lockutils [req-ddbcb3ad-d405-4702-b8d8-28f9b9cae30f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:24:04.065 134140 DEBUG oslo_service.periodic_task [req-9fb12051-c9fe-49c3-8011-17af44c9b342 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:24:04.069 134140 DEBUG oslo_concurrency.lockutils [req-239f0d6c-a1c0-4483-9f13-482e390e42af - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:24:04.069 134140 DEBUG oslo_concurrency.lockutils [req-239f0d6c-a1c0-4483-9f13-482e390e42af - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:24:10.036 134138 DEBUG oslo_service.periodic_task [req-8d2351cd-7db3-4fdc-8ba0-518e6c01cdf3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:24:10.040 134138 DEBUG oslo_concurrency.lockutils [req-56c7ea22-70f2-4593-8768-81bf313f6b58 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:24:10.040 134138 DEBUG oslo_concurrency.lockutils [req-56c7ea22-70f2-4593-8768-81bf313f6b58 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:24:15.089 134145 DEBUG oslo_service.periodic_task [req-0072e192-2ce6-454e-bfc9-5f52e4614cd2 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:24:15.093 134145 DEBUG oslo_concurrency.lockutils [req-de667691-dc27-485e-a015-7be87622ff79 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:24:15.093 134145 DEBUG oslo_concurrency.lockutils [req-de667691-dc27-485e-a015-7be87622ff79 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:24:15.852 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:24:15.853 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:24:15.853 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:24:15.853 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:24:15.853 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:24:15.854 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:24:15.854 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:24:15.854 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:24:15.854 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:24:15.857 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:24:15.857 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:24:15.857 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:24:31.069 134146 DEBUG oslo_service.periodic_task [req-ddbcb3ad-d405-4702-b8d8-28f9b9cae30f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:24:31.074 134146 DEBUG oslo_concurrency.lockutils [req-9029f49a-b035-4231-a4b1-68ee761d9f7d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:24:31.075 134146 DEBUG oslo_concurrency.lockutils [req-9029f49a-b035-4231-a4b1-68ee761d9f7d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:24:34.075 134140 DEBUG oslo_service.periodic_task [req-239f0d6c-a1c0-4483-9f13-482e390e42af - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:24:34.079 134140 DEBUG oslo_concurrency.lockutils [req-5fcbb485-61ab-48ca-9055-b1cd2059b800 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:24:34.080 134140 DEBUG oslo_concurrency.lockutils [req-5fcbb485-61ab-48ca-9055-b1cd2059b800 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:24:40.050 134138 DEBUG oslo_service.periodic_task [req-56c7ea22-70f2-4593-8768-81bf313f6b58 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:24:40.054 134138 DEBUG oslo_concurrency.lockutils [req-68a46261-a7a6-4458-b8d4-ee49339075bf - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:24:40.055 134138 DEBUG oslo_concurrency.lockutils [req-68a46261-a7a6-4458-b8d4-ee49339075bf - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:24:45.097 134145 DEBUG oslo_service.periodic_task [req-de667691-dc27-485e-a015-7be87622ff79 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:24:45.101 134145 DEBUG oslo_concurrency.lockutils [req-23bd2d2c-3806-46cd-bb53-3fefdd88ca12 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:24:45.102 134145 DEBUG oslo_concurrency.lockutils [req-23bd2d2c-3806-46cd-bb53-3fefdd88ca12 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:24:47.857 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:24:47.857 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:24:47.857 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:24:47.857 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:24:47.857 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:24:47.857 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:24:47.857 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:24:47.857 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:24:47.858 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:24:47.859 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:24:47.860 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:24:47.860 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:25:02.070 134146 DEBUG oslo_service.periodic_task [req-9029f49a-b035-4231-a4b1-68ee761d9f7d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:25:02.074 134146 DEBUG oslo_concurrency.lockutils [req-f18d6aa1-f4d1-409d-a7f9-220045bfc4dc - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:25:02.075 134146 DEBUG oslo_concurrency.lockutils [req-f18d6aa1-f4d1-409d-a7f9-220045bfc4dc - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:25:04.086 134140 DEBUG oslo_service.periodic_task [req-5fcbb485-61ab-48ca-9055-b1cd2059b800 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:25:04.090 134140 DEBUG oslo_concurrency.lockutils [req-2c6cbecb-45c0-4198-82f4-b28ba0b3c8db - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:25:04.090 134140 DEBUG oslo_concurrency.lockutils [req-2c6cbecb-45c0-4198-82f4-b28ba0b3c8db - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:25:10.064 134138 DEBUG oslo_service.periodic_task [req-68a46261-a7a6-4458-b8d4-ee49339075bf - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:25:10.069 134138 DEBUG oslo_concurrency.lockutils [req-bcd10abd-c153-47f0-af96-62464116cbb7 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:25:10.069 134138 DEBUG oslo_concurrency.lockutils [req-bcd10abd-c153-47f0-af96-62464116cbb7 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:25:16.052 134145 DEBUG oslo_service.periodic_task [req-23bd2d2c-3806-46cd-bb53-3fefdd88ca12 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:25:16.057 134145 DEBUG oslo_concurrency.lockutils [req-7541cfc2-9850-4ac8-8275-b70c5185f667 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:25:16.057 134145 DEBUG oslo_concurrency.lockutils [req-7541cfc2-9850-4ac8-8275-b70c5185f667 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:25:32.082 134146 DEBUG oslo_service.periodic_task [req-f18d6aa1-f4d1-409d-a7f9-220045bfc4dc - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:25:32.086 134146 DEBUG oslo_concurrency.lockutils [req-ce17e831-d560-4ac7-8b43-259e1cd55315 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:25:32.087 134146 DEBUG oslo_concurrency.lockutils [req-ce17e831-d560-4ac7-8b43-259e1cd55315 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:25:34.096 134140 DEBUG oslo_service.periodic_task [req-2c6cbecb-45c0-4198-82f4-b28ba0b3c8db - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:25:34.100 134140 DEBUG oslo_concurrency.lockutils [req-5117f83c-96e1-4724-9b30-2f2572326eac - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:25:34.100 134140 DEBUG oslo_concurrency.lockutils [req-5117f83c-96e1-4724-9b30-2f2572326eac - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:25:40.078 134138 DEBUG oslo_service.periodic_task [req-bcd10abd-c153-47f0-af96-62464116cbb7 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:25:40.082 134138 DEBUG oslo_concurrency.lockutils [req-58723095-155a-46dd-a73e-c578e2456ed6 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:25:40.083 134138 DEBUG oslo_concurrency.lockutils [req-58723095-155a-46dd-a73e-c578e2456ed6 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:25:45.824 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 54fdcaceaac04be09779bfc46590bb9e __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:25:45.824 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 54fdcaceaac04be09779bfc46590bb9e __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:25:45.824 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 54fdcaceaac04be09779bfc46590bb9e __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:25:45.825 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:25:45.825 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:25:45.825 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 54fdcaceaac04be09779bfc46590bb9e poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:25:45.825 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 54fdcaceaac04be09779bfc46590bb9e poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:25:45.825 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:25:45.825 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:25:45.825 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:25:45.825 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:25:45.825 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 54fdcaceaac04be09779bfc46590bb9e __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:25:45.826 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:25:45.826 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 54fdcaceaac04be09779bfc46590bb9e poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:25:45.826 134140 DEBUG oslo_concurrency.lockutils [req-dda510e6-9f00-4156-a06b-773e64682ab6 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:25:45.826 134145 DEBUG oslo_concurrency.lockutils [req-dda510e6-9f00-4156-a06b-773e64682ab6 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:25:45.826 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:25:45.826 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:25:45.826 134140 DEBUG nova.scheduler.host_manager [req-dda510e6-9f00-4156-a06b-773e64682ab6 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:25:45.826 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:25:45.826 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 54fdcaceaac04be09779bfc46590bb9e poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:25:45.826 134145 DEBUG nova.scheduler.host_manager [req-dda510e6-9f00-4156-a06b-773e64682ab6 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:25:45.826 134140 DEBUG oslo_concurrency.lockutils [req-dda510e6-9f00-4156-a06b-773e64682ab6 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:25:45.826 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:25:45.826 134145 DEBUG oslo_concurrency.lockutils [req-dda510e6-9f00-4156-a06b-773e64682ab6 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:25:45.826 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:25:45.826 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:25:45.827 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:25:45.827 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:25:45.827 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:25:45.827 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:25:45.827 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:25:45.827 134138 DEBUG oslo_concurrency.lockutils [req-dda510e6-9f00-4156-a06b-773e64682ab6 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:25:45.827 134138 DEBUG nova.scheduler.host_manager [req-dda510e6-9f00-4156-a06b-773e64682ab6 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:25:45.827 134138 DEBUG oslo_concurrency.lockutils [req-dda510e6-9f00-4156-a06b-773e64682ab6 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:25:45.827 134146 DEBUG oslo_concurrency.lockutils [req-dda510e6-9f00-4156-a06b-773e64682ab6 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:25:45.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:25:45.827 134146 DEBUG nova.scheduler.host_manager [req-dda510e6-9f00-4156-a06b-773e64682ab6 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:25:45.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:25:45.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:25:45.828 134146 DEBUG oslo_concurrency.lockutils [req-dda510e6-9f00-4156-a06b-773e64682ab6 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:25:45.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:25:45.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:25:45.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:25:46.063 134145 DEBUG oslo_service.periodic_task [req-7541cfc2-9850-4ac8-8275-b70c5185f667 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:25:46.067 134145 DEBUG oslo_concurrency.lockutils [req-a37711c9-c1aa-453c-a23e-72df5b3ded52 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:25:46.067 134145 DEBUG oslo_concurrency.lockutils [req-a37711c9-c1aa-453c-a23e-72df5b3ded52 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:25:46.828 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:25:46.828 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:25:46.829 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:25:46.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:25:46.829 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:25:46.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:25:46.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:25:46.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:25:46.830 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:25:46.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:25:46.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:25:46.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:25:48.830 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:25:48.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:25:48.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:25:48.831 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:25:48.831 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:25:48.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:25:48.831 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:25:48.831 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:25:48.831 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:25:48.832 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:25:48.832 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:25:48.832 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:25:52.834 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:25:52.834 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:25:52.834 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:25:52.835 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:25:52.835 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:25:52.835 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:25:52.835 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:25:52.836 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:25:52.836 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:25:52.836 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:25:52.836 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:25:52.836 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:26:00.840 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:26:00.840 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:26:00.840 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:26:00.841 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:26:00.841 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:26:00.841 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:26:00.842 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:26:00.842 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:26:00.842 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:26:00.842 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:26:00.842 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:26:00.843 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:26:03.069 134146 DEBUG oslo_service.periodic_task [req-ce17e831-d560-4ac7-8b43-259e1cd55315 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:26:03.074 134146 DEBUG oslo_concurrency.lockutils [req-3c888ebd-cf0e-4d83-a741-f1a885750e32 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:26:03.074 134146 DEBUG oslo_concurrency.lockutils [req-3c888ebd-cf0e-4d83-a741-f1a885750e32 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:26:04.107 134140 DEBUG oslo_service.periodic_task [req-5117f83c-96e1-4724-9b30-2f2572326eac - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:26:04.111 134140 DEBUG oslo_concurrency.lockutils [req-49686cfb-4a72-4d57-ad3d-312d129d548f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:26:04.111 134140 DEBUG oslo_concurrency.lockutils [req-49686cfb-4a72-4d57-ad3d-312d129d548f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:26:10.094 134138 DEBUG oslo_service.periodic_task [req-58723095-155a-46dd-a73e-c578e2456ed6 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:26:10.098 134138 DEBUG oslo_concurrency.lockutils [req-debaeace-49f0-4da3-8715-b9a33ba334cc - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:26:10.098 134138 DEBUG oslo_concurrency.lockutils [req-debaeace-49f0-4da3-8715-b9a33ba334cc - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:26:16.072 134145 DEBUG oslo_service.periodic_task [req-a37711c9-c1aa-453c-a23e-72df5b3ded52 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells
run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:26:16.076 134145 DEBUG oslo_concurrency.lockutils [req-dfa9dc34-0846-4ee4-9bc2-7f72d544faa1 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:26:16.077 134145 DEBUG oslo_concurrency.lockutils [req-dfa9dc34-0846-4ee4-9bc2-7f72d544faa1 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:26:16.841 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:26:16.842 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:26:16.842 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:26:16.843 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:26:16.843 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:26:16.844 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:26:16.843 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:26:16.844 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:26:16.844 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:26:16.845 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:26:16.845 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:26:16.846 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:26:33.081 134146 DEBUG oslo_service.periodic_task [req-3c888ebd-cf0e-4d83-a741-f1a885750e32 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:26:33.085 134146 DEBUG oslo_concurrency.lockutils [req-b56a6010-4a7c-44c5-a6df-4d7d74cf7877 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:26:33.085 134146 DEBUG oslo_concurrency.lockutils [req-b56a6010-4a7c-44c5-a6df-4d7d74cf7877 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:26:35.052 134140 DEBUG oslo_service.periodic_task 
[req-49686cfb-4a72-4d57-ad3d-312d129d548f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:26:35.057 134140 DEBUG oslo_concurrency.lockutils [req-4d523044-08b0-476a-934a-bc32ed03d4ca - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:26:35.057 134140 DEBUG oslo_concurrency.lockutils [req-4d523044-08b0-476a-934a-bc32ed03d4ca - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:26:41.019 134138 DEBUG oslo_service.periodic_task [req-debaeace-49f0-4da3-8715-b9a33ba334cc - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:26:41.024 134138 DEBUG oslo_concurrency.lockutils [req-88ae8016-a12c-436e-bbdd-a4b3fe1fe3ea - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:26:41.024 134138 DEBUG oslo_concurrency.lockutils [req-88ae8016-a12c-436e-bbdd-a4b3fe1fe3ea - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:26:46.082 134145 DEBUG oslo_service.periodic_task [req-dfa9dc34-0846-4ee4-9bc2-7f72d544faa1 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells 
run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:26:46.086 134145 DEBUG oslo_concurrency.lockutils [req-b7d3e269-65ff-4b77-b9d8-01dc7058b6f5 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:26:46.086 134145 DEBUG oslo_concurrency.lockutils [req-b7d3e269-65ff-4b77-b9d8-01dc7058b6f5 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:27:03.082 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:27:03.083 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:27:03.083 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:27:03.098 134146 DEBUG oslo_service.periodic_task [req-b56a6010-4a7c-44c5-a6df-4d7d74cf7877 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:27:03.102 134146 DEBUG oslo_concurrency.lockutils [req-5fefe447-cf62-4b54-9b77-ac411c6a0452 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:27:03.102 134146 DEBUG oslo_concurrency.lockutils 
[req-5fefe447-cf62-4b54-9b77-ac411c6a0452 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:27:03.120 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:27:03.120 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:27:03.120 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:27:03.123 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:27:03.123 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:27:03.123 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:27:03.140 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:27:03.140 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:27:03.140 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:27:05.064 134140 DEBUG oslo_service.periodic_task 
[req-4d523044-08b0-476a-934a-bc32ed03d4ca - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:27:05.067 134140 DEBUG oslo_concurrency.lockutils [req-b91662ab-1d03-4b56-9eac-49201dd19a72 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:27:05.067 134140 DEBUG oslo_concurrency.lockutils [req-b91662ab-1d03-4b56-9eac-49201dd19a72 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:27:12.020 134138 DEBUG oslo_service.periodic_task [req-88ae8016-a12c-436e-bbdd-a4b3fe1fe3ea - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:27:12.024 134138 DEBUG oslo_concurrency.lockutils [req-432aab81-fb53-411c-832e-9139b7026818 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:27:12.025 134138 DEBUG oslo_concurrency.lockutils [req-432aab81-fb53-411c-832e-9139b7026818 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:27:16.092 134145 DEBUG oslo_service.periodic_task [req-b7d3e269-65ff-4b77-b9d8-01dc7058b6f5 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells 
run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:27:16.096 134145 DEBUG oslo_concurrency.lockutils [req-cfc8ec98-8222-472e-b094-bd323c72cec4 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:27:16.096 134145 DEBUG oslo_concurrency.lockutils [req-cfc8ec98-8222-472e-b094-bd323c72cec4 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:27:33.114 134146 DEBUG oslo_service.periodic_task [req-5fefe447-cf62-4b54-9b77-ac411c6a0452 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:27:33.118 134146 DEBUG oslo_concurrency.lockutils [req-6845907e-2d4d-4433-a382-eadddb4ebea8 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:27:33.119 134146 DEBUG oslo_concurrency.lockutils [req-6845907e-2d4d-4433-a382-eadddb4ebea8 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:27:35.072 134140 DEBUG oslo_service.periodic_task [req-b91662ab-1d03-4b56-9eac-49201dd19a72 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:27:35.083 134140 DEBUG 
oslo_concurrency.lockutils [req-c28797cd-8a8c-4af5-893f-e55aa89df668 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:27:35.083 134140 DEBUG oslo_concurrency.lockutils [req-c28797cd-8a8c-4af5-893f-e55aa89df668 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:27:43.019 134138 DEBUG oslo_service.periodic_task [req-432aab81-fb53-411c-832e-9139b7026818 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:27:43.023 134138 DEBUG oslo_concurrency.lockutils [req-f6b7e08c-e8ff-4344-b119-e39971504ecc - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:27:43.024 134138 DEBUG oslo_concurrency.lockutils [req-f6b7e08c-e8ff-4344-b119-e39971504ecc - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:27:47.052 134145 DEBUG oslo_service.periodic_task [req-cfc8ec98-8222-472e-b094-bd323c72cec4 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:27:47.056 134145 DEBUG oslo_concurrency.lockutils [req-0a958bde-6d77-48b1-9aa4-6f02ef1dbc1e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" 
acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:27:47.056 134145 DEBUG oslo_concurrency.lockutils [req-0a958bde-6d77-48b1-9aa4-6f02ef1dbc1e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:27:50.825 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: d62d0b7afae2410bb69b66e0dd2381f4 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:27:50.825 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: d62d0b7afae2410bb69b66e0dd2381f4 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:27:50.825 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:27:50.825 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: d62d0b7afae2410bb69b66e0dd2381f4 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:27:50.825 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:27:50.825 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: d62d0b7afae2410bb69b66e0dd2381f4 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:27:50.825 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: d62d0b7afae2410bb69b66e0dd2381f4 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 
01:27:50.825 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:27:50.826 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:27:50.826 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: d62d0b7afae2410bb69b66e0dd2381f4 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:27:50.826 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:27:50.826 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:27:50.826 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:27:50.826 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:27:50.826 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:27:50.826 134145 DEBUG oslo_concurrency.lockutils [req-1ec32d76-0cc0-428c-b49c-271af7a09f14 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:27:50.826 134138 DEBUG oslo_concurrency.lockutils [req-1ec32d76-0cc0-428c-b49c-271af7a09f14 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s 
inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:27:50.826 134145 DEBUG nova.scheduler.host_manager [req-1ec32d76-0cc0-428c-b49c-271af7a09f14 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:27:50.826 134146 DEBUG oslo_concurrency.lockutils [req-1ec32d76-0cc0-428c-b49c-271af7a09f14 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:27:50.827 134138 DEBUG nova.scheduler.host_manager [req-1ec32d76-0cc0-428c-b49c-271af7a09f14 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:27:50.827 134145 DEBUG oslo_concurrency.lockutils [req-1ec32d76-0cc0-428c-b49c-271af7a09f14 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:27:50.827 134146 DEBUG nova.scheduler.host_manager [req-1ec32d76-0cc0-428c-b49c-271af7a09f14 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:27:50.827 134138 DEBUG oslo_concurrency.lockutils [req-1ec32d76-0cc0-428c-b49c-271af7a09f14 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:27:50.827 134146 DEBUG oslo_concurrency.lockutils [req-1ec32d76-0cc0-428c-b49c-271af7a09f14 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:27:50.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:27:50.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:27:50.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:27:50.827 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: d62d0b7afae2410bb69b66e0dd2381f4 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:27:50.828 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:27:50.828 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: d62d0b7afae2410bb69b66e0dd2381f4 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:27:50.828 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:27:50.828 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:27:50.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:27:50.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:27:50.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:27:50.829 134140 DEBUG oslo_concurrency.lockutils [req-1ec32d76-0cc0-428c-b49c-271af7a09f14 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:27:50.829 134140 DEBUG nova.scheduler.host_manager [req-1ec32d76-0cc0-428c-b49c-271af7a09f14 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:27:50.829 134140 DEBUG oslo_concurrency.lockutils [req-1ec32d76-0cc0-428c-b49c-271af7a09f14 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:27:50.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:27:50.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:27:50.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:27:50.830 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:27:50.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:27:50.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:27:51.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:27:51.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:27:51.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:27:51.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:27:51.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:27:51.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:27:51.831 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:27:51.831 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:27:51.831 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:27:51.832 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:27:51.832 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:27:51.832 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:27:53.830 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:27:53.831 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:27:53.831 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:27:53.831 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:27:53.832 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:27:53.832 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:27:53.834 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:27:53.834 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:27:53.834 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:27:53.835 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:27:53.835 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:27:53.835 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:27:57.834 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:27:57.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:27:57.834 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:27:57.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:27:57.834 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:27:57.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:27:57.838 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:27:57.838 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:27:57.838 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:27:57.838 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:27:57.838 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:27:57.839 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:28:03.129 134146 DEBUG oslo_service.periodic_task [req-6845907e-2d4d-4433-a382-eadddb4ebea8 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:28:03.133 134146 DEBUG oslo_concurrency.lockutils [req-7bac1fcc-f2e4-4cdc-8818-573e4499e84b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:28:03.133 134146 DEBUG oslo_concurrency.lockutils [req-7bac1fcc-f2e4-4cdc-8818-573e4499e84b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:28:05.089 134140 DEBUG oslo_service.periodic_task [req-c28797cd-8a8c-4af5-893f-e55aa89df668 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:28:05.094 134140 DEBUG oslo_concurrency.lockutils [req-b2f0c137-4dd4-4202-b044-3271f84bc574 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:28:05.094 134140 DEBUG oslo_concurrency.lockutils [req-b2f0c137-4dd4-4202-b044-3271f84bc574 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:28:05.836 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:28:05.836 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:28:05.836 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:28:05.836 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:28:05.836 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:28:05.836 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:28:05.840 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:28:05.840 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:28:05.840 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:28:05.840 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:28:05.840 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:28:05.841 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:28:13.035 134138 DEBUG oslo_service.periodic_task [req-f6b7e08c-e8ff-4344-b119-e39971504ecc - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:28:13.039 134138 DEBUG oslo_concurrency.lockutils [req-f1156347-0c58-4fe4-bd96-b2cff5a01400 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:28:13.040 134138 DEBUG oslo_concurrency.lockutils [req-f1156347-0c58-4fe4-bd96-b2cff5a01400 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:28:17.062 134145 DEBUG oslo_service.periodic_task [req-0a958bde-6d77-48b1-9aa4-6f02ef1dbc1e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:28:17.066 134145 DEBUG oslo_concurrency.lockutils [req-63c340f2-7e20-4077-bbe7-efd3df0570da - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:28:17.067 134145 DEBUG oslo_concurrency.lockutils [req-63c340f2-7e20-4077-bbe7-efd3df0570da - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:28:21.838 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:28:21.838 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:28:21.838 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:28:21.839 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:28:21.839 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:28:21.839 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:28:21.842 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:28:21.842 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:28:21.842 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:28:21.843 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:28:21.843 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:28:21.843 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:28:34.070 134146 DEBUG oslo_service.periodic_task [req-7bac1fcc-f2e4-4cdc-8818-573e4499e84b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:28:34.073 134146 DEBUG oslo_concurrency.lockutils [req-b46c5207-3657-4a44-aab7-f297653e489e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:28:34.074 134146 DEBUG oslo_concurrency.lockutils [req-b46c5207-3657-4a44-aab7-f297653e489e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:28:36.052 134140 DEBUG oslo_service.periodic_task [req-b2f0c137-4dd4-4202-b044-3271f84bc574 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:28:36.057 134140 DEBUG oslo_concurrency.lockutils [req-1b850cde-306b-4b70-9d31-507482965d00 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:28:36.057 134140 DEBUG oslo_concurrency.lockutils [req-1b850cde-306b-4b70-9d31-507482965d00 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:28:43.050 134138 DEBUG oslo_service.periodic_task [req-f1156347-0c58-4fe4-bd96-b2cff5a01400 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:28:43.054 134138 DEBUG oslo_concurrency.lockutils [req-6125e928-2f75-4243-bf04-73ad6557e654 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:28:43.054 134138 DEBUG oslo_concurrency.lockutils [req-6125e928-2f75-4243-bf04-73ad6557e654 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:28:47.072 134145 DEBUG oslo_service.periodic_task [req-63c340f2-7e20-4077-bbe7-efd3df0570da - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:28:47.075 134145 DEBUG oslo_concurrency.lockutils [req-65e47d87-3029-426c-825d-fbad6c12eaf5 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:28:47.076 134145 DEBUG oslo_concurrency.lockutils [req-65e47d87-3029-426c-825d-fbad6c12eaf5 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:29:03.086 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:29:03.086 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:29:03.087 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:29:03.125 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:29:03.125 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:29:03.125 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:29:03.127 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:29:03.127 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:29:03.127 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:29:03.150 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:29:03.151 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:29:03.151 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:29:04.080 134146 DEBUG oslo_service.periodic_task [req-b46c5207-3657-4a44-aab7-f297653e489e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:29:04.084 134146 DEBUG oslo_concurrency.lockutils [req-cb5e217a-d21c-42c9-bf59-b5f049e17d2d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:29:04.084 134146 DEBUG oslo_concurrency.lockutils [req-cb5e217a-d21c-42c9-bf59-b5f049e17d2d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:29:06.063 134140 DEBUG oslo_service.periodic_task [req-1b850cde-306b-4b70-9d31-507482965d00 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:29:06.067 134140 DEBUG oslo_concurrency.lockutils [req-1cb2dedc-824a-4a5e-b390-3afc30a8711c - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:29:06.067 134140 DEBUG oslo_concurrency.lockutils [req-1cb2dedc-824a-4a5e-b390-3afc30a8711c - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:29:14.020 134138 DEBUG oslo_service.periodic_task [req-6125e928-2f75-4243-bf04-73ad6557e654 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:29:14.024 134138 DEBUG oslo_concurrency.lockutils [req-20f7f01f-daea-46bb-881f-6e9206085ae0 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:29:14.024 134138 DEBUG oslo_concurrency.lockutils [req-20f7f01f-daea-46bb-881f-6e9206085ae0 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:29:17.081 134145 DEBUG oslo_service.periodic_task [req-65e47d87-3029-426c-825d-fbad6c12eaf5 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:29:17.085 134145 DEBUG oslo_concurrency.lockutils [req-8eef7165-893c-4df6-9563-033ec7fdfdf3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:29:17.085 134145 DEBUG oslo_concurrency.lockutils [req-8eef7165-893c-4df6-9563-033ec7fdfdf3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:29:34.089 134146 DEBUG oslo_service.periodic_task [req-cb5e217a-d21c-42c9-bf59-b5f049e17d2d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:29:34.094 134146 DEBUG oslo_concurrency.lockutils [req-d8f0757f-c1d4-4bc0-ad5d-55cfb3d2e587 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:29:34.094 134146 DEBUG oslo_concurrency.lockutils [req-d8f0757f-c1d4-4bc0-ad5d-55cfb3d2e587 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:29:36.072 134140 DEBUG oslo_service.periodic_task [req-1cb2dedc-824a-4a5e-b390-3afc30a8711c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:29:36.077 134140 DEBUG oslo_concurrency.lockutils [req-dd40c035-5fcc-4515-856e-fde12dc6f00e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:29:36.077 134140 DEBUG oslo_concurrency.lockutils [req-dd40c035-5fcc-4515-856e-fde12dc6f00e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:29:44.031 134138 DEBUG oslo_service.periodic_task [req-20f7f01f-daea-46bb-881f-6e9206085ae0 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:29:44.035 134138 DEBUG oslo_concurrency.lockutils [req-f7314966-40a1-46f5-8ec6-17d4468cd157 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:29:44.035 134138 DEBUG oslo_concurrency.lockutils [req-f7314966-40a1-46f5-8ec6-17d4468cd157 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:29:47.091 134145 DEBUG oslo_service.periodic_task [req-8eef7165-893c-4df6-9563-033ec7fdfdf3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:29:47.095 134145 DEBUG oslo_concurrency.lockutils [req-09d562ab-7d92-4cec-aa16-4766294712f7 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:29:47.095 134145 DEBUG oslo_concurrency.lockutils [req-09d562ab-7d92-4cec-aa16-4766294712f7 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:29:55.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 5d4f9c0338924bbfa7dca5e71ab7e13e __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:29:55.829 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 5d4f9c0338924bbfa7dca5e71ab7e13e __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:29:55.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 5d4f9c0338924bbfa7dca5e71ab7e13e __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:29:55.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 5d4f9c0338924bbfa7dca5e71ab7e13e __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:29:55.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:29:55.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 5d4f9c0338924bbfa7dca5e71ab7e13e poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:29:55.829 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:29:55.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:29:55.829 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 5d4f9c0338924bbfa7dca5e71ab7e13e poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:29:55.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:29:55.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 5d4f9c0338924bbfa7dca5e71ab7e13e poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:29:55.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 5d4f9c0338924bbfa7dca5e71ab7e13e poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:29:55.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:29:55.830 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:29:55.830 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:29:55.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:29:55.830 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:29:55.830 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:29:55.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:29:55.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:29:55.830 134140 DEBUG oslo_concurrency.lockutils [req-9c36bc82-328b-48e8-85e8-578b5c2111d9 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:29:55.830 134145 DEBUG oslo_concurrency.lockutils [req-9c36bc82-328b-48e8-85e8-578b5c2111d9 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:29:55.831 134140 DEBUG nova.scheduler.host_manager [req-9c36bc82-328b-48e8-85e8-578b5c2111d9 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:29:55.831 134138 DEBUG oslo_concurrency.lockutils [req-9c36bc82-328b-48e8-85e8-578b5c2111d9 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:29:55.831 134146 DEBUG oslo_concurrency.lockutils [req-9c36bc82-328b-48e8-85e8-578b5c2111d9 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:29:55.831 134145 DEBUG nova.scheduler.host_manager [req-9c36bc82-328b-48e8-85e8-578b5c2111d9 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:29:55.831 134140 DEBUG oslo_concurrency.lockutils [req-9c36bc82-328b-48e8-85e8-578b5c2111d9 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:29:55.831 134138 DEBUG nova.scheduler.host_manager [req-9c36bc82-328b-48e8-85e8-578b5c2111d9 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:29:55.831 134146 DEBUG nova.scheduler.host_manager [req-9c36bc82-328b-48e8-85e8-578b5c2111d9 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:29:55.831 134145 DEBUG oslo_concurrency.lockutils [req-9c36bc82-328b-48e8-85e8-578b5c2111d9 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:29:55.831 134138 DEBUG oslo_concurrency.lockutils [req-9c36bc82-328b-48e8-85e8-578b5c2111d9 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:29:55.831 134146 DEBUG oslo_concurrency.lockutils [req-9c36bc82-328b-48e8-85e8-578b5c2111d9 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:29:55.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:29:55.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:29:55.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:29:55.831 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:29:55.831 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:29:55.831 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:29:55.832 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:29:55.832 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:29:55.832 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:29:55.832 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:29:55.832 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:29:55.832 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:29:56.833 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:29:56.833 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:29:56.833 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:29:56.833 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:29:56.833 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:29:56.833 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:29:56.833 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:29:56.834 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:29:56.834 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:29:56.834 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:29:56.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:29:56.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:29:58.835 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:29:58.836 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:29:58.836 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:29:58.836 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:29:58.836 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:29:58.836 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:29:58.836 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:29:58.836 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:29:58.837 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:29:58.836 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection 
timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:29:58.837 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:29:58.837 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:30:02.838 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:30:02.838 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:30:02.838 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:30:02.839 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:30:02.839 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:30:02.839 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:30:02.841 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:30:02.841 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:30:02.841 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is 
running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:30:02.841 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:30:02.841 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:30:02.842 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:30:04.101 134146 DEBUG oslo_service.periodic_task [req-d8f0757f-c1d4-4bc0-ad5d-55cfb3d2e587 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:30:04.105 134146 DEBUG oslo_concurrency.lockutils [req-c0f09a56-faba-4aff-bdd6-a19e7ad15892 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:30:04.105 134146 DEBUG oslo_concurrency.lockutils [req-c0f09a56-faba-4aff-bdd6-a19e7ad15892 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:30:06.082 134140 DEBUG oslo_service.periodic_task [req-dd40c035-5fcc-4515-856e-fde12dc6f00e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:30:06.086 134140 DEBUG oslo_concurrency.lockutils [req-14a2efa5-dfac-4846-ace6-955e87069051 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" 
acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:30:06.087 134140 DEBUG oslo_concurrency.lockutils [req-14a2efa5-dfac-4846-ace6-955e87069051 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:30:10.844 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:30:10.844 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:30:10.845 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:30:10.845 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:30:10.845 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:30:10.845 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:30:10.847 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:30:10.847 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:30:10.847 134146 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:30:10.847 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:30:10.848 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:30:10.848 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:30:15.019 134138 DEBUG oslo_service.periodic_task [req-f7314966-40a1-46f5-8ec6-17d4468cd157 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:30:15.024 134138 DEBUG oslo_concurrency.lockutils [req-5b531773-53b4-4910-b782-78a87bd733ae - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:30:15.024 134138 DEBUG oslo_concurrency.lockutils [req-5b531773-53b4-4910-b782-78a87bd733ae - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:30:18.052 134145 DEBUG oslo_service.periodic_task [req-09d562ab-7d92-4cec-aa16-4766294712f7 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:30:18.057 134145 DEBUG oslo_concurrency.lockutils 
[req-7365f786-3a1f-474c-9db3-b377f8c18210 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:30:18.057 134145 DEBUG oslo_concurrency.lockutils [req-7365f786-3a1f-474c-9db3-b377f8c18210 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:30:26.846 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:30:26.846 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:30:26.846 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:30:26.847 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:30:26.847 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:30:26.847 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:30:26.849 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:30:26.849 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:30:26.849 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:30:26.850 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:30:26.850 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:30:26.850 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:30:34.116 134146 DEBUG oslo_service.periodic_task [req-c0f09a56-faba-4aff-bdd6-a19e7ad15892 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:30:34.120 134146 DEBUG oslo_concurrency.lockutils [req-b8d249ca-b5da-454f-a440-ae2a995929fa - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:30:34.121 134146 DEBUG oslo_concurrency.lockutils [req-b8d249ca-b5da-454f-a440-ae2a995929fa - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:30:37.052 134140 DEBUG oslo_service.periodic_task [req-14a2efa5-dfac-4846-ace6-955e87069051 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks 
/usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:30:37.056 134140 DEBUG oslo_concurrency.lockutils [req-3a5ce68f-17c0-44f6-8aea-1668b8dbdc1f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:30:37.057 134140 DEBUG oslo_concurrency.lockutils [req-3a5ce68f-17c0-44f6-8aea-1668b8dbdc1f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:30:45.030 134138 DEBUG oslo_service.periodic_task [req-5b531773-53b4-4910-b782-78a87bd733ae - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:30:45.035 134138 DEBUG oslo_concurrency.lockutils [req-46cab3cd-e03f-4f02-a94b-4c5fd24d5fa7 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:30:45.035 134138 DEBUG oslo_concurrency.lockutils [req-46cab3cd-e03f-4f02-a94b-4c5fd24d5fa7 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:30:49.052 134145 DEBUG oslo_service.periodic_task [req-7365f786-3a1f-474c-9db3-b377f8c18210 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:30:49.056 134145 DEBUG 
oslo_concurrency.lockutils [req-89e7a194-9d04-409a-9922-3b4c005e52f0 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:30:49.057 134145 DEBUG oslo_concurrency.lockutils [req-89e7a194-9d04-409a-9922-3b4c005e52f0 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:31:03.086 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:31:03.086 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:31:03.087 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:31:03.126 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:31:03.126 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:31:03.126 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:31:03.130 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:31:03.130 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener 
is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:31:03.131 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:31:03.154 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:31:03.154 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:31:03.154 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:31:04.126 134146 DEBUG oslo_service.periodic_task [req-b8d249ca-b5da-454f-a440-ae2a995929fa - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:31:04.131 134146 DEBUG oslo_concurrency.lockutils [req-e6dae470-1381-4461-bc17-77a341025a07 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:31:04.132 134146 DEBUG oslo_concurrency.lockutils [req-e6dae470-1381-4461-bc17-77a341025a07 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:31:08.052 134140 DEBUG oslo_service.periodic_task [req-3a5ce68f-17c0-44f6-8aea-1668b8dbdc1f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks 
/usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:31:08.057 134140 DEBUG oslo_concurrency.lockutils [req-3c806b12-9e4f-409a-97b1-788fcacd1492 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:31:08.057 134140 DEBUG oslo_concurrency.lockutils [req-3c806b12-9e4f-409a-97b1-788fcacd1492 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:31:15.041 134138 DEBUG oslo_service.periodic_task [req-46cab3cd-e03f-4f02-a94b-4c5fd24d5fa7 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:31:15.047 134138 DEBUG oslo_concurrency.lockutils [req-87b65dd4-67d6-45c3-94a1-54ca125e8a2e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:31:15.047 134138 DEBUG oslo_concurrency.lockutils [req-87b65dd4-67d6-45c3-94a1-54ca125e8a2e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:31:20.052 134145 DEBUG oslo_service.periodic_task [req-89e7a194-9d04-409a-9922-3b4c005e52f0 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:31:20.058 134145 DEBUG 
oslo_concurrency.lockutils [req-c025511a-5677-447c-9496-d31bc8f8f94b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:31:20.058 134145 DEBUG oslo_concurrency.lockutils [req-c025511a-5677-447c-9496-d31bc8f8f94b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:31:34.137 134146 DEBUG oslo_service.periodic_task [req-e6dae470-1381-4461-bc17-77a341025a07 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:31:34.141 134146 DEBUG oslo_concurrency.lockutils [req-4279a345-df86-43d4-b3da-2069cd406f4e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:31:34.142 134146 DEBUG oslo_concurrency.lockutils [req-4279a345-df86-43d4-b3da-2069cd406f4e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:31:39.052 134140 DEBUG oslo_service.periodic_task [req-3c806b12-9e4f-409a-97b1-788fcacd1492 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:31:39.092 134140 DEBUG oslo_concurrency.lockutils [req-a66bafba-9d7f-4c9d-9fbc-60520c0ecaf5 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" 
acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:31:39.093 134140 DEBUG oslo_concurrency.lockutils [req-a66bafba-9d7f-4c9d-9fbc-60520c0ecaf5 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:31:45.056 134138 DEBUG oslo_service.periodic_task [req-87b65dd4-67d6-45c3-94a1-54ca125e8a2e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:31:45.061 134138 DEBUG oslo_concurrency.lockutils [req-7ded2296-7740-4314-80c3-f988a5835218 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:31:45.061 134138 DEBUG oslo_concurrency.lockutils [req-7ded2296-7740-4314-80c3-f988a5835218 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:31:51.051 134145 DEBUG oslo_service.periodic_task [req-c025511a-5677-447c-9496-d31bc8f8f94b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:31:51.056 134145 DEBUG oslo_concurrency.lockutils [req-40363ee8-b046-4ebe-b613-db507f41d926 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:31:51.056 134145 DEBUG oslo_concurrency.lockutils [req-40363ee8-b046-4ebe-b613-db507f41d926 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:31:56.826 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 004de9b4a6084215bfa24f1266de5260 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:31:56.826 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 004de9b4a6084215bfa24f1266de5260 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:31:56.826 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 004de9b4a6084215bfa24f1266de5260 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:31:56.827 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 004de9b4a6084215bfa24f1266de5260 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:31:56.827 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:31:56.827 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:31:56.827 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 004de9b4a6084215bfa24f1266de5260 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:31:56.827 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 004de9b4a6084215bfa24f1266de5260 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:31:56.827 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:31:56.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:31:56.827 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 004de9b4a6084215bfa24f1266de5260 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:31:56.827 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:31:56.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 004de9b4a6084215bfa24f1266de5260 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:31:56.827 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:31:56.827 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:31:56.827 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:31:56.827 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:31:56.827 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:31:56.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:31:56.828 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:31:56.828 134140 DEBUG oslo_concurrency.lockutils [req-607f579a-f1ad-4487-91c5-f2c548553218 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:31:56.828 134146 DEBUG oslo_concurrency.lockutils [req-607f579a-f1ad-4487-91c5-f2c548553218 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:31:56.828 134140 DEBUG nova.scheduler.host_manager [req-607f579a-f1ad-4487-91c5-f2c548553218 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:31:56.828 134146 DEBUG nova.scheduler.host_manager [req-607f579a-f1ad-4487-91c5-f2c548553218 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:31:56.828 134145 DEBUG oslo_concurrency.lockutils [req-607f579a-f1ad-4487-91c5-f2c548553218 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:31:56.828 134140 DEBUG oslo_concurrency.lockutils [req-607f579a-f1ad-4487-91c5-f2c548553218 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:31:56.828 134145 DEBUG nova.scheduler.host_manager [req-607f579a-f1ad-4487-91c5-f2c548553218 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:31:56.828 134146 DEBUG oslo_concurrency.lockutils [req-607f579a-f1ad-4487-91c5-f2c548553218 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:31:56.828 134145 DEBUG oslo_concurrency.lockutils [req-607f579a-f1ad-4487-91c5-f2c548553218 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:31:56.829 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:31:56.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:31:56.829 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:31:56.829 134138 DEBUG oslo_concurrency.lockutils [req-607f579a-f1ad-4487-91c5-f2c548553218 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:31:56.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:31:56.829 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:31:56.829 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:31:56.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:31:56.829 134138 DEBUG nova.scheduler.host_manager [req-607f579a-f1ad-4487-91c5-f2c548553218 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:31:56.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:31:56.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:31:56.829 134138 DEBUG oslo_concurrency.lockutils [req-607f579a-f1ad-4487-91c5-f2c548553218 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:31:56.830 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:31:56.835 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:31:56.835 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:31:57.830 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:31:57.830 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:31:57.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:31:57.830 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:31:57.830 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:31:57.830 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:31:57.831 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:31:57.831 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:31:57.831 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:31:57.837 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:31:57.837 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:31:57.837 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:31:59.832 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:31:59.832 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:31:59.832 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:31:59.833 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:31:59.833 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:31:59.833 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:31:59.833 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:31:59.833 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:31:59.833 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:31:59.839 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:31:59.840 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:31:59.840 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:32:03.834 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:32:03.834 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:32:03.835 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:32:03.835 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:32:03.835 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:32:03.835 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:32:03.835 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:32:03.835 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:32:03.835 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:32:03.842 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:32:03.842 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:32:03.842 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:32:04.149 134146 DEBUG oslo_service.periodic_task [req-4279a345-df86-43d4-b3da-2069cd406f4e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:32:04.153 134146 DEBUG oslo_concurrency.lockutils [req-11fa3040-df97-4864-9fb1-ed50bfaa2b97 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:32:04.154 134146 DEBUG oslo_concurrency.lockutils [req-11fa3040-df97-4864-9fb1-ed50bfaa2b97 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:32:10.053 134140 DEBUG oslo_service.periodic_task [req-a66bafba-9d7f-4c9d-9fbc-60520c0ecaf5 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:32:10.057 134140 DEBUG oslo_concurrency.lockutils [req-955cbc46-9cab-41ca-9474-ee60975bb9ce - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:32:10.058 134140 DEBUG oslo_concurrency.lockutils [req-955cbc46-9cab-41ca-9474-ee60975bb9ce - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:32:11.838 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:32:11.838 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:32:11.838 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:32:11.842 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:32:11.842 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:32:11.842 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:32:11.842 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:32:11.842 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:32:11.842 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:32:11.850 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:32:11.851 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:32:11.851 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:32:16.020 134138 DEBUG oslo_service.periodic_task [req-7ded2296-7740-4314-80c3-f988a5835218 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:32:16.024 134138 DEBUG oslo_concurrency.lockutils [req-5f051d20-0863-47d5-a6bb-e84aff32c660 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:32:16.024 134138 DEBUG oslo_concurrency.lockutils [req-5f051d20-0863-47d5-a6bb-e84aff32c660 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:32:21.066 134145 DEBUG oslo_service.periodic_task [req-40363ee8-b046-4ebe-b613-db507f41d926 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:32:21.070 134145 DEBUG oslo_concurrency.lockutils [req-7549d429-6898-47fd-b892-fda5eae72882 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:32:21.070 134145 DEBUG oslo_concurrency.lockutils [req-7549d429-6898-47fd-b892-fda5eae72882 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:32:27.839 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:32:27.840 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:32:27.840 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:32:27.844 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:32:27.844 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:32:27.844 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:32:27.844 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:32:27.844 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:32:27.844 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:32:27.853 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:32:27.854 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:32:27.854 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:32:34.162 134146 DEBUG oslo_service.periodic_task [req-11fa3040-df97-4864-9fb1-ed50bfaa2b97 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:32:34.166 134146 DEBUG oslo_concurrency.lockutils [req-592ed682-ec31-49e5-bb16-4a38712150c4 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:32:34.166 134146 DEBUG oslo_concurrency.lockutils [req-592ed682-ec31-49e5-bb16-4a38712150c4 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:32:40.067 134140 DEBUG oslo_service.periodic_task [req-955cbc46-9cab-41ca-9474-ee60975bb9ce - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:32:40.073 134140 DEBUG oslo_concurrency.lockutils [req-51ab6178-eab4-4cbe-9ac0-783c0126785a - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:32:40.073 134140 DEBUG oslo_concurrency.lockutils [req-51ab6178-eab4-4cbe-9ac0-783c0126785a - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:32:46.028 134138 DEBUG oslo_service.periodic_task [req-5f051d20-0863-47d5-a6bb-e84aff32c660 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:32:46.033 134138 DEBUG oslo_concurrency.lockutils [req-0569f79e-5937-4428-ae5c-48ef513bbd54 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:32:46.033 134138 DEBUG oslo_concurrency.lockutils [req-0569f79e-5937-4428-ae5c-48ef513bbd54 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:32:51.078 134145 DEBUG oslo_service.periodic_task [req-7549d429-6898-47fd-b892-fda5eae72882 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:32:51.083 134145 DEBUG oslo_concurrency.lockutils [req-98aac0ae-a0a3-4728-81b0-7d008bd6b3d2 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:32:51.083 134145 DEBUG oslo_concurrency.lockutils [req-98aac0ae-a0a3-4728-81b0-7d008bd6b3d2 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:33:03.090 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:33:03.090 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:33:03.090 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:33:03.130 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:33:03.131 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:33:03.131 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:33:03.133 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:33:03.134 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:33:03.134 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:33:03.156 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:33:03.156 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:33:03.156 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:33:04.172 134146 DEBUG oslo_service.periodic_task [req-592ed682-ec31-49e5-bb16-4a38712150c4 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:33:04.177 134146 DEBUG oslo_concurrency.lockutils [req-dd94e347-df9a-4d69-83f1-9385fcabe0d4 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:33:04.177 134146 DEBUG oslo_concurrency.lockutils [req-dd94e347-df9a-4d69-83f1-9385fcabe0d4 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:33:10.082 134140 DEBUG oslo_service.periodic_task [req-51ab6178-eab4-4cbe-9ac0-783c0126785a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:33:10.086 134140 DEBUG oslo_concurrency.lockutils [req-cd3165c6-ba10-4ec7-a261-eb1b64a84b75 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:33:10.087 134140 DEBUG oslo_concurrency.lockutils [req-cd3165c6-ba10-4ec7-a261-eb1b64a84b75 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:33:16.040 134138 DEBUG oslo_service.periodic_task [req-0569f79e-5937-4428-ae5c-48ef513bbd54 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:33:16.045 134138 DEBUG oslo_concurrency.lockutils [req-16cf3ec4-3a6e-43aa-9ac6-2810842795ad - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:33:16.045 134138 DEBUG oslo_concurrency.lockutils [req-16cf3ec4-3a6e-43aa-9ac6-2810842795ad - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:33:21.092 134145 DEBUG oslo_service.periodic_task [req-98aac0ae-a0a3-4728-81b0-7d008bd6b3d2 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:33:21.096 134145 DEBUG oslo_concurrency.lockutils [req-7c7ec03d-68d4-4b9e-9dc7-7e2c6136961e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:33:21.096 134145 DEBUG oslo_concurrency.lockutils [req-7c7ec03d-68d4-4b9e-9dc7-7e2c6136961e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:33:35.069 134146 DEBUG oslo_service.periodic_task [req-dd94e347-df9a-4d69-83f1-9385fcabe0d4 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:33:35.073 134146 DEBUG oslo_concurrency.lockutils [req-581b44e3-1dcd-4b3f-a216-c9462b34ee70 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:33:35.073 134146 DEBUG oslo_concurrency.lockutils [req-581b44e3-1dcd-4b3f-a216-c9462b34ee70 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:33:40.096 134140 DEBUG oslo_service.periodic_task [req-cd3165c6-ba10-4ec7-a261-eb1b64a84b75 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:33:40.101 134140 DEBUG oslo_concurrency.lockutils [req-0c1f62a0-1b7f-4c8d-aa81-85a432931602 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:33:40.101 134140 DEBUG oslo_concurrency.lockutils [req-0c1f62a0-1b7f-4c8d-aa81-85a432931602 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:33:46.050 134138 DEBUG oslo_service.periodic_task [req-16cf3ec4-3a6e-43aa-9ac6-2810842795ad - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:33:46.054 134138 DEBUG oslo_concurrency.lockutils [req-7dd66fa1-088a-41a5-8a9d-cbfd1643ea70 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:33:46.055 134138 DEBUG oslo_concurrency.lockutils [req-7dd66fa1-088a-41a5-8a9d-cbfd1643ea70 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:33:52.052 134145 DEBUG oslo_service.periodic_task [req-7c7ec03d-68d4-4b9e-9dc7-7e2c6136961e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:33:52.056 134145 DEBUG oslo_concurrency.lockutils [req-5163ad22-b8d2-432c-af4a-553f7b227ce0 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:33:52.056 134145 DEBUG oslo_concurrency.lockutils [req-5163ad22-b8d2-432c-af4a-553f7b227ce0 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:34:06.069 134146 DEBUG oslo_service.periodic_task [req-581b44e3-1dcd-4b3f-a216-c9462b34ee70 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:34:06.074 134146 DEBUG oslo_concurrency.lockutils [req-e192c61f-e4b8-42d3-b961-86d82134d917 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:34:06.074 134146 DEBUG oslo_concurrency.lockutils [req-e192c61f-e4b8-42d3-b961-86d82134d917 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:34:07.092 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:34:07.093 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:34:07.093 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:34:07.134 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:34:07.134 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:34:07.134 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:34:07.136 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:34:07.136 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:34:07.136 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:34:07.158 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:34:07.158 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:34:07.158 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:34:11.052 134140 DEBUG oslo_service.periodic_task [req-0c1f62a0-1b7f-4c8d-aa81-85a432931602 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:34:11.056 134140 DEBUG oslo_concurrency.lockutils [req-877a9688-7c29-4fa7-aa2b-db720ca14816 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:34:11.056 134140 DEBUG oslo_concurrency.lockutils [req-877a9688-7c29-4fa7-aa2b-db720ca14816 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:34:16.061 134138 DEBUG oslo_service.periodic_task [req-7dd66fa1-088a-41a5-8a9d-cbfd1643ea70 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks 
/usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:34:16.065 134138 DEBUG oslo_concurrency.lockutils [req-cf3bc2ce-24b3-48ab-a4b7-90b43675aa89 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:34:16.066 134138 DEBUG oslo_concurrency.lockutils [req-cf3bc2ce-24b3-48ab-a4b7-90b43675aa89 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:34:22.063 134145 DEBUG oslo_service.periodic_task [req-5163ad22-b8d2-432c-af4a-553f7b227ce0 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:34:22.067 134145 DEBUG oslo_concurrency.lockutils [req-a2e2fd20-f7f2-4997-8510-e083578463b2 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:34:22.067 134145 DEBUG oslo_concurrency.lockutils [req-a2e2fd20-f7f2-4997-8510-e083578463b2 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:34:36.080 134146 DEBUG oslo_service.periodic_task [req-e192c61f-e4b8-42d3-b961-86d82134d917 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:34:36.084 134146 DEBUG 
oslo_concurrency.lockutils [req-2eebdea1-aeeb-4992-9781-c69b75b0d35e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:34:36.085 134146 DEBUG oslo_concurrency.lockutils [req-2eebdea1-aeeb-4992-9781-c69b75b0d35e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:34:41.065 134140 DEBUG oslo_service.periodic_task [req-877a9688-7c29-4fa7-aa2b-db720ca14816 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:34:41.069 134140 DEBUG oslo_concurrency.lockutils [req-ac266664-f69b-4098-939a-9124519a040f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:34:41.069 134140 DEBUG oslo_concurrency.lockutils [req-ac266664-f69b-4098-939a-9124519a040f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:34:46.070 134138 DEBUG oslo_service.periodic_task [req-cf3bc2ce-24b3-48ab-a4b7-90b43675aa89 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:34:46.074 134138 DEBUG oslo_concurrency.lockutils [req-53da91e1-2fe5-470e-99f6-8df75c901da4 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" 
acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:34:46.074 134138 DEBUG oslo_concurrency.lockutils [req-53da91e1-2fe5-470e-99f6-8df75c901da4 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:34:52.077 134145 DEBUG oslo_service.periodic_task [req-a2e2fd20-f7f2-4997-8510-e083578463b2 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:34:52.083 134145 DEBUG oslo_concurrency.lockutils [req-a25d4bdb-72a5-4023-9dbc-a831d2b70d3f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:34:52.083 134145 DEBUG oslo_concurrency.lockutils [req-a25d4bdb-72a5-4023-9dbc-a831d2b70d3f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:35:06.090 134146 DEBUG oslo_service.periodic_task [req-2eebdea1-aeeb-4992-9781-c69b75b0d35e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:35:06.094 134146 DEBUG oslo_concurrency.lockutils [req-3f352bcd-0829-4fdf-aeae-bbcbd9458492 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:35:06.095 134146 DEBUG oslo_concurrency.lockutils [req-3f352bcd-0829-4fdf-aeae-bbcbd9458492 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:35:12.053 134140 DEBUG oslo_service.periodic_task [req-ac266664-f69b-4098-939a-9124519a040f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:35:12.058 134140 DEBUG oslo_concurrency.lockutils [req-784ffe58-d781-4260-b753-953219f2f8a4 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:35:12.058 134140 DEBUG oslo_concurrency.lockutils [req-784ffe58-d781-4260-b753-953219f2f8a4 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:35:16.081 134138 DEBUG oslo_service.periodic_task [req-53da91e1-2fe5-470e-99f6-8df75c901da4 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:35:16.085 134138 DEBUG oslo_concurrency.lockutils [req-21268663-0328-4add-be8c-fdea9530f9b6 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:35:16.086 134138 DEBUG 
oslo_concurrency.lockutils [req-21268663-0328-4add-be8c-fdea9530f9b6 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:35:22.100 134145 DEBUG oslo_service.periodic_task [req-a25d4bdb-72a5-4023-9dbc-a831d2b70d3f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:35:22.105 134145 DEBUG oslo_concurrency.lockutils [req-cc8e4c6f-4504-4190-890d-7e20212d4813 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:35:22.105 134145 DEBUG oslo_concurrency.lockutils [req-cc8e4c6f-4504-4190-890d-7e20212d4813 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:35:37.069 134146 DEBUG oslo_service.periodic_task [req-3f352bcd-0829-4fdf-aeae-bbcbd9458492 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:35:37.076 134146 DEBUG oslo_concurrency.lockutils [req-a8677fb0-b841-49e9-899c-2f69ac2f22b3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:35:37.076 134146 DEBUG oslo_concurrency.lockutils [req-a8677fb0-b841-49e9-899c-2f69ac2f22b3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:35:42.069 134140 DEBUG oslo_service.periodic_task [req-784ffe58-d781-4260-b753-953219f2f8a4 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:35:42.090 134140 DEBUG oslo_concurrency.lockutils [req-306afd11-dcf2-4378-ade4-571aef96d816 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:35:42.134 134140 DEBUG oslo_concurrency.lockutils [req-306afd11-dcf2-4378-ade4-571aef96d816 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.044s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:35:46.090 134138 DEBUG oslo_service.periodic_task [req-21268663-0328-4add-be8c-fdea9530f9b6 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:35:46.094 134138 DEBUG oslo_concurrency.lockutils [req-c32eb7e6-ec89-47e2-a224-e95188805733 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:35:46.094 134138 DEBUG oslo_concurrency.lockutils [req-c32eb7e6-ec89-47e2-a224-e95188805733 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:35:53.052 134145 DEBUG oslo_service.periodic_task [req-cc8e4c6f-4504-4190-890d-7e20212d4813 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:35:53.056 134145 DEBUG oslo_concurrency.lockutils [req-3cd4b4ba-f418-4f9d-aa56-21f5e0762e84 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:35:53.057 134145 DEBUG oslo_concurrency.lockutils [req-3cd4b4ba-f418-4f9d-aa56-21f5e0762e84 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:36:03.614 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 5efc57dadd1a44bb80c42ea44e003282 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:36:03.614 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 5efc57dadd1a44bb80c42ea44e003282 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:36:03.614 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 5efc57dadd1a44bb80c42ea44e003282 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:36:03.614 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 5efc57dadd1a44bb80c42ea44e003282 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:36:03.614 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:36:03.614 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:36:03.614 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:36:03.614 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 5efc57dadd1a44bb80c42ea44e003282 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:36:03.614 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:36:03.615 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 5efc57dadd1a44bb80c42ea44e003282 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:36:03.615 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 5efc57dadd1a44bb80c42ea44e003282 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:36:03.615 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 5efc57dadd1a44bb80c42ea44e003282 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:36:03.615 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:36:03.615 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:36:03.615 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:36:03.615 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:36:03.615 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:36:03.615 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:36:03.615 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:36:03.615 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:36:03.615 134140 DEBUG oslo_concurrency.lockutils [req-df2a6f07-c959-4ef4-920e-233e6abfebe9 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:36:03.615 134138 DEBUG oslo_concurrency.lockutils [req-df2a6f07-c959-4ef4-920e-233e6abfebe9 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:36:03.615 134145 DEBUG oslo_concurrency.lockutils [req-df2a6f07-c959-4ef4-920e-233e6abfebe9 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:36:03.616 134140 DEBUG nova.scheduler.host_manager [req-df2a6f07-c959-4ef4-920e-233e6abfebe9 - - 
- - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:36:03.616 134146 DEBUG oslo_concurrency.lockutils [req-df2a6f07-c959-4ef4-920e-233e6abfebe9 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:36:03.616 134138 DEBUG nova.scheduler.host_manager [req-df2a6f07-c959-4ef4-920e-233e6abfebe9 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:36:03.616 134140 DEBUG oslo_concurrency.lockutils [req-df2a6f07-c959-4ef4-920e-233e6abfebe9 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:36:03.616 134145 DEBUG nova.scheduler.host_manager [req-df2a6f07-c959-4ef4-920e-233e6abfebe9 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:36:03.616 134146 DEBUG nova.scheduler.host_manager [req-df2a6f07-c959-4ef4-920e-233e6abfebe9 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:36:03.616 134138 DEBUG oslo_concurrency.lockutils [req-df2a6f07-c959-4ef4-920e-233e6abfebe9 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:36:03.616 134145 DEBUG oslo_concurrency.lockutils [req-df2a6f07-c959-4ef4-920e-233e6abfebe9 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:36:03.616 134146 DEBUG oslo_concurrency.lockutils [req-df2a6f07-c959-4ef4-920e-233e6abfebe9 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:36:03.616 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:36:03.616 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:36:03.616 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:36:03.616 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:36:03.616 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:36:03.616 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running 
poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:36:03.616 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:36:03.617 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:36:03.617 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:36:03.617 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:36:03.617 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:36:03.617 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:36:04.618 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:36:04.618 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:36:04.618 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:36:04.618 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:36:04.618 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener 
is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:36:04.618 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:36:04.618 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:36:04.618 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:36:04.618 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:36:04.618 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:36:04.618 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:36:04.618 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:36:06.619 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:36:06.619 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:36:06.620 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:36:06.620 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener 
connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:36:06.620 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:36:06.621 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:36:06.621 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:36:06.621 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:36:06.621 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:36:06.621 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:36:06.622 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:36:06.622 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:36:08.069 134146 DEBUG oslo_service.periodic_task [req-a8677fb0-b841-49e9-899c-2f69ac2f22b3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:36:08.073 134146 DEBUG oslo_concurrency.lockutils [req-c79b1a2c-4205-45d2-a2d7-ff004b089547 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:36:08.073 134146 DEBUG oslo_concurrency.lockutils [req-c79b1a2c-4205-45d2-a2d7-ff004b089547 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:36:10.622 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:36:10.623 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:36:10.623 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:36:10.623 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:36:10.623 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:36:10.623 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:36:10.624 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:36:10.625 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:36:10.625 134145 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:36:10.626 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:36:10.626 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:36:10.626 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:36:12.141 134140 DEBUG oslo_service.periodic_task [req-306afd11-dcf2-4378-ade4-571aef96d816 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:36:12.145 134140 DEBUG oslo_concurrency.lockutils [req-2f66d26d-a07f-400e-b79c-b1ae6e09767f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:36:12.146 134140 DEBUG oslo_concurrency.lockutils [req-2f66d26d-a07f-400e-b79c-b1ae6e09767f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:36:16.100 134138 DEBUG oslo_service.periodic_task [req-c32eb7e6-ec89-47e2-a224-e95188805733 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:36:16.105 134138 DEBUG oslo_concurrency.lockutils 
[req-26cab0c7-81b2-4c16-93ba-819b2803b4cc - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:36:16.105 134138 DEBUG oslo_concurrency.lockutils [req-26cab0c7-81b2-4c16-93ba-819b2803b4cc - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:36:18.626 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:36:18.626 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:36:18.626 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:36:18.626 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:36:18.626 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:36:18.626 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:36:18.626 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:36:18.626 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:36:18.626 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:36:18.630 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:36:18.630 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:36:18.630 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:36:23.065 134145 DEBUG oslo_service.periodic_task [req-3cd4b4ba-f418-4f9d-aa56-21f5e0762e84 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:36:23.070 134145 DEBUG oslo_concurrency.lockutils [req-c4bb4004-d9a0-4aa4-a12b-a70fe2405aec - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:36:23.070 134145 DEBUG oslo_concurrency.lockutils [req-c4bb4004-d9a0-4aa4-a12b-a70fe2405aec - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:36:34.628 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:36:34.628 134145 DEBUG oslo_messaging._drivers.amqpdriver 
[-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:36:34.629 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:36:34.629 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:36:34.629 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:36:34.629 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:36:34.629 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:36:34.629 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:36:34.629 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:36:34.633 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:36:34.633 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:36:34.633 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:36:38.082 134146 DEBUG oslo_service.periodic_task 
[req-c79b1a2c-4205-45d2-a2d7-ff004b089547 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:36:38.086 134146 DEBUG oslo_concurrency.lockutils [req-814c8b3b-9032-4952-8c99-ae492cab8fd1 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:36:38.088 134146 DEBUG oslo_concurrency.lockutils [req-814c8b3b-9032-4952-8c99-ae492cab8fd1 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:36:43.052 134140 DEBUG oslo_service.periodic_task [req-2f66d26d-a07f-400e-b79c-b1ae6e09767f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:36:43.057 134140 DEBUG oslo_concurrency.lockutils [req-e17d8669-00d5-4872-b4a6-9f93aa93b7a9 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:36:43.110 134140 DEBUG oslo_concurrency.lockutils [req-e17d8669-00d5-4872-b4a6-9f93aa93b7a9 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.052s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:36:47.020 134138 DEBUG oslo_service.periodic_task [req-26cab0c7-81b2-4c16-93ba-819b2803b4cc - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells 
run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:36:47.025 134138 DEBUG oslo_concurrency.lockutils [req-fa53979c-18cd-4986-bb28-17e9eda1fecb - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:36:47.025 134138 DEBUG oslo_concurrency.lockutils [req-fa53979c-18cd-4986-bb28-17e9eda1fecb - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:36:53.077 134145 DEBUG oslo_service.periodic_task [req-c4bb4004-d9a0-4aa4-a12b-a70fe2405aec - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:36:53.082 134145 DEBUG oslo_concurrency.lockutils [req-86d78d6c-3b86-4984-8ad1-54a51f91face - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:36:53.083 134145 DEBUG oslo_concurrency.lockutils [req-86d78d6c-3b86-4984-8ad1-54a51f91face - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:37:06.630 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:37:06.630 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:37:06.630 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:37:06.631 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:37:06.632 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:37:06.632 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:37:06.632 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:37:06.632 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:37:06.633 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:37:06.636 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:37:06.636 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:37:06.636 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:37:08.093 134146 DEBUG oslo_service.periodic_task 
[req-814c8b3b-9032-4952-8c99-ae492cab8fd1 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:37:08.098 134146 DEBUG oslo_concurrency.lockutils [req-051efe2e-7549-4f8e-ae3d-7984c1fe92e4 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:37:08.098 134146 DEBUG oslo_concurrency.lockutils [req-051efe2e-7549-4f8e-ae3d-7984c1fe92e4 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:37:14.052 134140 DEBUG oslo_service.periodic_task [req-e17d8669-00d5-4872-b4a6-9f93aa93b7a9 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:37:14.056 134140 DEBUG oslo_concurrency.lockutils [req-94b1bea8-c955-421c-bf70-8e6a17162eda - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:37:14.056 134140 DEBUG oslo_concurrency.lockutils [req-94b1bea8-c955-421c-bf70-8e6a17162eda - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:37:17.032 134138 DEBUG oslo_service.periodic_task [req-fa53979c-18cd-4986-bb28-17e9eda1fecb - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells 
run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:37:17.037 134138 DEBUG oslo_concurrency.lockutils [req-3bc238cb-ccb2-40f8-bdc3-07b0a3386d40 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:37:17.038 134138 DEBUG oslo_concurrency.lockutils [req-3bc238cb-ccb2-40f8-bdc3-07b0a3386d40 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:37:23.087 134145 DEBUG oslo_service.periodic_task [req-86d78d6c-3b86-4984-8ad1-54a51f91face - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:37:23.099 134145 DEBUG oslo_concurrency.lockutils [req-0ac416f1-db30-411b-bd3c-1a68f22cb6e5 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:37:23.099 134145 DEBUG oslo_concurrency.lockutils [req-0ac416f1-db30-411b-bd3c-1a68f22cb6e5 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:37:38.105 134146 DEBUG oslo_service.periodic_task [req-051efe2e-7549-4f8e-ae3d-7984c1fe92e4 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:37:38.109 134146 DEBUG 
oslo_concurrency.lockutils [req-e00994a5-1e39-4ba2-97d7-bbd7cda18e37 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:37:38.110 134146 DEBUG oslo_concurrency.lockutils [req-e00994a5-1e39-4ba2-97d7-bbd7cda18e37 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:37:45.051 134140 DEBUG oslo_service.periodic_task [req-94b1bea8-c955-421c-bf70-8e6a17162eda - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:37:45.056 134140 DEBUG oslo_concurrency.lockutils [req-413136b6-2974-49e8-9fb9-35feae74df7c - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:37:45.056 134140 DEBUG oslo_concurrency.lockutils [req-413136b6-2974-49e8-9fb9-35feae74df7c - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:37:47.045 134138 DEBUG oslo_service.periodic_task [req-3bc238cb-ccb2-40f8-bdc3-07b0a3386d40 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:37:47.049 134138 DEBUG oslo_concurrency.lockutils [req-337f95da-7cca-4fc7-83d1-0937253a7e59 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" 
acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:37:47.050 134138 DEBUG oslo_concurrency.lockutils [req-337f95da-7cca-4fc7-83d1-0937253a7e59 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:37:54.052 134145 DEBUG oslo_service.periodic_task [req-0ac416f1-db30-411b-bd3c-1a68f22cb6e5 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:37:54.057 134145 DEBUG oslo_concurrency.lockutils [req-39e0128a-6a03-4132-901e-0ddd8855ce9d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:37:54.057 134145 DEBUG oslo_concurrency.lockutils [req-39e0128a-6a03-4132-901e-0ddd8855ce9d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:38:08.117 134146 DEBUG oslo_service.periodic_task [req-e00994a5-1e39-4ba2-97d7-bbd7cda18e37 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:38:08.122 134146 DEBUG oslo_concurrency.lockutils [req-7c721804-88f2-4cb6-874f-e24a3a128c8c - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:38:08.122 134146 DEBUG oslo_concurrency.lockutils [req-7c721804-88f2-4cb6-874f-e24a3a128c8c - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:38:10.634 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:38:10.635 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:38:10.635 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:38:10.637 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:38:10.638 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:38:10.638 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:38:10.638 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:38:10.638 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:38:10.638 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:38:10.643 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:38:10.644 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:38:10.644 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:38:15.065 134140 DEBUG oslo_service.periodic_task [req-413136b6-2974-49e8-9fb9-35feae74df7c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:38:15.068 134140 DEBUG oslo_concurrency.lockutils [req-d8103eea-6bf8-4d08-baa3-d8e24e02b76f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:38:15.069 134140 DEBUG oslo_concurrency.lockutils [req-d8103eea-6bf8-4d08-baa3-d8e24e02b76f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:38:16.111 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: aa0702770c794a92a2351f17b49b14d2 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:38:16.111 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: aa0702770c794a92a2351f17b49b14d2 __call__ 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:38:16.111 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: aa0702770c794a92a2351f17b49b14d2 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:38:16.111 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:38:16.111 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:38:16.111 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: aa0702770c794a92a2351f17b49b14d2 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:38:16.111 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:38:16.111 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: aa0702770c794a92a2351f17b49b14d2 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:38:16.112 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: aa0702770c794a92a2351f17b49b14d2 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:38:16.112 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:38:16.112 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:38:16.112 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:38:16.112 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:38:16.112 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:38:16.112 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:38:16.112 134146 DEBUG oslo_concurrency.lockutils [req-e24062ec-b881-402e-aba6-dbc01046451e - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:38:16.112 134145 DEBUG oslo_concurrency.lockutils [req-e24062ec-b881-402e-aba6-dbc01046451e - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:38:16.112 134138 DEBUG oslo_concurrency.lockutils [req-e24062ec-b881-402e-aba6-dbc01046451e - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:38:16.113 134146 DEBUG nova.scheduler.host_manager [req-e24062ec-b881-402e-aba6-dbc01046451e - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:38:16.113 134145 DEBUG nova.scheduler.host_manager [req-e24062ec-b881-402e-aba6-dbc01046451e - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:38:16.113 134138 DEBUG nova.scheduler.host_manager [req-e24062ec-b881-402e-aba6-dbc01046451e - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:38:16.113 134146 DEBUG oslo_concurrency.lockutils [req-e24062ec-b881-402e-aba6-dbc01046451e - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:38:16.113 134145 DEBUG oslo_concurrency.lockutils [req-e24062ec-b881-402e-aba6-dbc01046451e - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:38:16.113 134138 DEBUG oslo_concurrency.lockutils [req-e24062ec-b881-402e-aba6-dbc01046451e - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:38:16.113 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:38:16.113 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:38:16.113 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:38:16.114 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: aa0702770c794a92a2351f17b49b14d2 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:38:16.114 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:38:16.114 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: aa0702770c794a92a2351f17b49b14d2 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:38:16.114 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:38:16.114 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:38:16.114 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:38:16.114 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:38:16.114 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:38:16.115 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:38:16.115 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:38:16.115 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:38:16.116 134140 DEBUG oslo_concurrency.lockutils [req-e24062ec-b881-402e-aba6-dbc01046451e - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:38:16.116 134140 DEBUG nova.scheduler.host_manager [req-e24062ec-b881-402e-aba6-dbc01046451e - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:38:16.116 134140 DEBUG oslo_concurrency.lockutils [req-e24062ec-b881-402e-aba6-dbc01046451e - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:38:16.117 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:38:16.118 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:38:16.118 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:38:17.056 134138 DEBUG oslo_service.periodic_task [req-337f95da-7cca-4fc7-83d1-0937253a7e59 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:38:17.060 134138 DEBUG oslo_concurrency.lockutils [req-7b2f298c-a452-40a6-a53e-0f00c9774049 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:38:17.060 134138 DEBUG oslo_concurrency.lockutils [req-7b2f298c-a452-40a6-a53e-0f00c9774049 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:38:17.114 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:38:17.115 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:38:17.115 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:38:17.115 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:38:17.115 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:38:17.116 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:38:17.116 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:38:17.116 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:38:17.116 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:38:17.119 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:38:17.119 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:38:17.119 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:38:19.117 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:38:19.117 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:38:19.118 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:38:19.118 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:38:19.118 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:38:19.118 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:38:19.118 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:38:19.118 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:38:19.118 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:38:19.122 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:38:19.122 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:38:19.122 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:38:23.122 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:38:23.122 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:38:23.122 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:38:23.122 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:38:23.122 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:38:23.122 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:38:23.122 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:38:23.122 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:38:23.123 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:38:23.126 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:38:23.126 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:38:23.126 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:38:24.063 134145 DEBUG oslo_service.periodic_task [req-39e0128a-6a03-4132-901e-0ddd8855ce9d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:38:24.077 134145 DEBUG oslo_concurrency.lockutils [req-2619d6e2-5dd6-4fea-ae55-ebf389936152 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:38:24.078 134145 DEBUG oslo_concurrency.lockutils [req-2619d6e2-5dd6-4fea-ae55-ebf389936152 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:38:31.128 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:38:31.128 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:38:31.128 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:38:31.128 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:38:31.129 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:38:31.129 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:38:31.130 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:38:31.130 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:38:31.130 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:38:31.132 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:38:31.132 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:38:31.132 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:38:39.069 134146 DEBUG oslo_service.periodic_task [req-7c721804-88f2-4cb6-874f-e24a3a128c8c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:38:39.075 134146 DEBUG oslo_concurrency.lockutils [req-033941b9-c396-4e06-9043-2f92caf6380c - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:38:39.075 134146 DEBUG oslo_concurrency.lockutils [req-033941b9-c396-4e06-9043-2f92caf6380c - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:38:45.082 134140 DEBUG oslo_service.periodic_task [req-d8103eea-6bf8-4d08-baa3-d8e24e02b76f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:38:45.086 134140 DEBUG oslo_concurrency.lockutils [req-da48fc2e-3a5e-4e38-8ac5-2d93f17842ed - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:38:45.086 134140 DEBUG oslo_concurrency.lockutils [req-da48fc2e-3a5e-4e38-8ac5-2d93f17842ed - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:38:47.129 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:38:47.130 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:38:47.130 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:38:47.130 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:38:47.130 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:38:47.130 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:38:47.132 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:38:47.133 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:38:47.133 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:38:47.135 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:38:47.135 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:38:47.135 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:38:48.020 134138 DEBUG oslo_service.periodic_task [req-7b2f298c-a452-40a6-a53e-0f00c9774049 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:38:48.024 134138 DEBUG oslo_concurrency.lockutils [req-30dbc32a-9657-4393-a254-128dd6a7e588 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:38:48.025 134138 DEBUG oslo_concurrency.lockutils [req-30dbc32a-9657-4393-a254-128dd6a7e588 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:38:55.052 134145 DEBUG oslo_service.periodic_task [req-2619d6e2-5dd6-4fea-ae55-ebf389936152 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:38:55.057 134145 DEBUG oslo_concurrency.lockutils [req-1a6ab489-00a4-4950-9f1b-596286dea3ad - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:38:55.057 134145 DEBUG oslo_concurrency.lockutils [req-1a6ab489-00a4-4950-9f1b-596286dea3ad - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:39:09.083 134146 DEBUG oslo_service.periodic_task [req-033941b9-c396-4e06-9043-2f92caf6380c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:39:09.087 134146 DEBUG oslo_concurrency.lockutils [req-25032bf0-3d2b-4993-8957-d35bdb8d52a1 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:39:09.087 134146 DEBUG oslo_concurrency.lockutils [req-25032bf0-3d2b-4993-8957-d35bdb8d52a1 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:39:16.051 134140 DEBUG oslo_service.periodic_task [req-da48fc2e-3a5e-4e38-8ac5-2d93f17842ed - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:39:16.056 134140 DEBUG oslo_concurrency.lockutils [req-86c18360-455b-4d93-ba59-73193c93fe9d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:39:16.056 134140 DEBUG oslo_concurrency.lockutils [req-86c18360-455b-4d93-ba59-73193c93fe9d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:39:18.031 134138 DEBUG oslo_service.periodic_task [req-30dbc32a-9657-4393-a254-128dd6a7e588 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:39:18.035 134138 DEBUG oslo_concurrency.lockutils [req-19e859e4-db53-4828-a87f-782790dd144f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:39:18.036 134138 DEBUG oslo_concurrency.lockutils [req-19e859e4-db53-4828-a87f-782790dd144f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:39:25.068 134145 DEBUG oslo_service.periodic_task [req-1a6ab489-00a4-4950-9f1b-596286dea3ad - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:39:25.075 134145 DEBUG oslo_concurrency.lockutils [req-13cc14af-ade4-4c25-a0f6-2f45afc09e95 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:39:25.075 134145 DEBUG oslo_concurrency.lockutils [req-13cc14af-ade4-4c25-a0f6-2f45afc09e95 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:39:33.102 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:39:33.102 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:39:33.103 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:39:33.144 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:39:33.144 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:39:33.144 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:39:33.149 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:39:33.149 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:39:33.149 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:39:33.171 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:39:33.171 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:39:33.172 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:39:39.094 134146 DEBUG oslo_service.periodic_task [req-25032bf0-3d2b-4993-8957-d35bdb8d52a1 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:39:39.099 134146 DEBUG oslo_concurrency.lockutils [req-49fb24e1-5d65-4d57-8ff2-ee233448ebfc - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:39:39.099 134146 DEBUG oslo_concurrency.lockutils [req-49fb24e1-5d65-4d57-8ff2-ee233448ebfc - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:39:46.061 134140 DEBUG oslo_service.periodic_task [req-86c18360-455b-4d93-ba59-73193c93fe9d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:39:46.066 134140 DEBUG oslo_concurrency.lockutils [req-6b544e8a-95f8-4bce-91b0-f87ce4cdddd4 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:39:46.066 134140 DEBUG oslo_concurrency.lockutils [req-6b544e8a-95f8-4bce-91b0-f87ce4cdddd4 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:39:48.043 134138 DEBUG oslo_service.periodic_task [req-19e859e4-db53-4828-a87f-782790dd144f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:39:48.048 134138 DEBUG oslo_concurrency.lockutils [req-8ae67b66-2643-41cf-b8d5-06350e0748c3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:39:48.048 134138 DEBUG oslo_concurrency.lockutils [req-8ae67b66-2643-41cf-b8d5-06350e0748c3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:39:55.087 134145 DEBUG oslo_service.periodic_task [req-13cc14af-ade4-4c25-a0f6-2f45afc09e95 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:39:55.092 134145 DEBUG oslo_concurrency.lockutils [req-88dc6380-ef5b-46d8-a9a9-2d9900e2fe33 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:39:55.092 134145 DEBUG oslo_concurrency.lockutils [req-88dc6380-ef5b-46d8-a9a9-2d9900e2fe33 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:40:10.069 134146 DEBUG oslo_service.periodic_task [req-49fb24e1-5d65-4d57-8ff2-ee233448ebfc - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:40:10.074 134146 DEBUG oslo_concurrency.lockutils [req-72329a4a-1cd5-4e6b-b9ee-19c5a470b901 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:40:10.074 134146 DEBUG oslo_concurrency.lockutils [req-72329a4a-1cd5-4e6b-b9ee-19c5a470b901 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:40:16.074 134140 DEBUG oslo_service.periodic_task [req-6b544e8a-95f8-4bce-91b0-f87ce4cdddd4 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:40:16.078 134140 DEBUG oslo_concurrency.lockutils [req-fc9b1502-a131-4130-b6eb-02fb0269b963 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:40:16.078 134140 DEBUG oslo_concurrency.lockutils [req-fc9b1502-a131-4130-b6eb-02fb0269b963 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:40:18.058 134138 DEBUG oslo_service.periodic_task [req-8ae67b66-2643-41cf-b8d5-06350e0748c3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:40:18.062 134138 DEBUG oslo_concurrency.lockutils [req-6de957a6-c369-4da9-a2ac-f0298485cbb1 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:40:18.063 134138 DEBUG oslo_concurrency.lockutils [req-6de957a6-c369-4da9-a2ac-f0298485cbb1 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:40:21.053 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1dfb3b189e9b4e719f29be6ec0c3aec3 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:40:21.053 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1dfb3b189e9b4e719f29be6ec0c3aec3 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:40:21.053 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1dfb3b189e9b4e719f29be6ec0c3aec3 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:40:21.053 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:40:21.053 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1dfb3b189e9b4e719f29be6ec0c3aec3 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:40:21.053 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:40:21.053 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:40:21.053 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1dfb3b189e9b4e719f29be6ec0c3aec3 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:40:21.053 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1dfb3b189e9b4e719f29be6ec0c3aec3 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:40:21.053 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:40:21.054 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:40:21.054 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:40:21.054 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:40:21.054 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:40:21.054 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:40:21.054 134145 DEBUG oslo_concurrency.lockutils [req-5a1a5900-698a-41dc-bb96-2b68edfdaa11 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:40:21.054 134146 DEBUG oslo_concurrency.lockutils [req-5a1a5900-698a-41dc-bb96-2b68edfdaa11 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:40:21.055 134145 DEBUG nova.scheduler.host_manager [req-5a1a5900-698a-41dc-bb96-2b68edfdaa11 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:40:21.054 134138 DEBUG oslo_concurrency.lockutils [req-5a1a5900-698a-41dc-bb96-2b68edfdaa11 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:40:21.055 134146 DEBUG nova.scheduler.host_manager [req-5a1a5900-698a-41dc-bb96-2b68edfdaa11 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:40:21.055 134145 DEBUG oslo_concurrency.lockutils [req-5a1a5900-698a-41dc-bb96-2b68edfdaa11 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:40:21.055 134138 DEBUG nova.scheduler.host_manager [req-5a1a5900-698a-41dc-bb96-2b68edfdaa11 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'.
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:40:21.055 134146 DEBUG oslo_concurrency.lockutils [req-5a1a5900-698a-41dc-bb96-2b68edfdaa11 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:40:21.055 134138 DEBUG oslo_concurrency.lockutils [req-5a1a5900-698a-41dc-bb96-2b68edfdaa11 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:40:21.055 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:40:21.055 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:40:21.055 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:40:21.055 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:40:21.055 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:40:21.055 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:40:21.055 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:40:21.055 134138 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:40:21.056 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:40:21.056 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1dfb3b189e9b4e719f29be6ec0c3aec3 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:40:21.057 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:40:21.057 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1dfb3b189e9b4e719f29be6ec0c3aec3 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:40:21.057 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:40:21.057 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:40:21.058 134140 DEBUG oslo_concurrency.lockutils [req-5a1a5900-698a-41dc-bb96-2b68edfdaa11 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:40:21.058 134140 DEBUG nova.scheduler.host_manager [req-5a1a5900-698a-41dc-bb96-2b68edfdaa11 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:40:21.058 134140 DEBUG oslo_concurrency.lockutils [req-5a1a5900-698a-41dc-bb96-2b68edfdaa11 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:40:21.059 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:40:21.059 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:40:21.059 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:40:22.057 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:40:22.057 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:40:22.057 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:40:22.057 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:40:22.057 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:40:22.057 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:40:22.057 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:40:22.057 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:40:22.057 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:40:22.060 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:40:22.060 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:40:22.061 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:40:24.059 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:40:24.059 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:40:24.059 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:40:24.059 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:40:24.060 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:40:24.060 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:40:24.060 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:40:24.060 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:40:24.060 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:40:24.063 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:40:24.063 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:40:24.063 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:40:26.052 134145 DEBUG oslo_service.periodic_task [req-88dc6380-ef5b-46d8-a9a9-2d9900e2fe33 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:40:26.057 134145 DEBUG oslo_concurrency.lockutils [req-8ec8e4c1-b056-4270-8f3c-33f86d501638 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:40:26.057 134145 DEBUG 
oslo_concurrency.lockutils [req-8ec8e4c1-b056-4270-8f3c-33f86d501638 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:40:28.062 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:40:28.062 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:40:28.062 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:40:28.062 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:40:28.062 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:40:28.062 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:40:28.063 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:40:28.063 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:40:28.063 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:40:28.066 134140 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:40:28.066 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:40:28.066 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:40:36.063 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:40:36.064 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:40:36.064 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:40:36.064 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:40:36.064 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:40:36.064 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:40:36.064 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:40:36.065 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:40:36.065 134138 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:40:36.068 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:40:36.068 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:40:36.068 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:40:41.070 134146 DEBUG oslo_service.periodic_task [req-72329a4a-1cd5-4e6b-b9ee-19c5a470b901 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:40:41.074 134146 DEBUG oslo_concurrency.lockutils [req-bae60a88-c87d-4106-afa3-5bff68132553 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:40:41.074 134146 DEBUG oslo_concurrency.lockutils [req-bae60a88-c87d-4106-afa3-5bff68132553 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:40:46.083 134140 DEBUG oslo_service.periodic_task [req-fc9b1502-a131-4130-b6eb-02fb0269b963 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:40:46.087 134140 DEBUG oslo_concurrency.lockutils 
[req-c968b2ff-015b-4e45-9a07-b05503b7d414 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:40:46.088 134140 DEBUG oslo_concurrency.lockutils [req-c968b2ff-015b-4e45-9a07-b05503b7d414 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:40:48.070 134138 DEBUG oslo_service.periodic_task [req-6de957a6-c369-4da9-a2ac-f0298485cbb1 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:40:48.074 134138 DEBUG oslo_concurrency.lockutils [req-b2ce5529-3f45-4eb8-8215-5244b359df89 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:40:48.074 134138 DEBUG oslo_concurrency.lockutils [req-b2ce5529-3f45-4eb8-8215-5244b359df89 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:40:52.066 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:40:52.066 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:40:52.066 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] 
Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:40:52.066 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:40:52.066 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:40:52.066 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:40:52.066 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:40:52.067 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:40:52.067 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:40:52.070 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:40:52.070 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:40:52.070 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:40:56.062 134145 DEBUG oslo_service.periodic_task [req-8ec8e4c1-b056-4270-8f3c-33f86d501638 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks 
/usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:40:56.066 134145 DEBUG oslo_concurrency.lockutils [req-b9cceaf5-5ee8-4e90-a332-ee47be3069bd - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:40:56.066 134145 DEBUG oslo_concurrency.lockutils [req-b9cceaf5-5ee8-4e90-a332-ee47be3069bd - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:41:11.082 134146 DEBUG oslo_service.periodic_task [req-bae60a88-c87d-4106-afa3-5bff68132553 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:41:11.086 134146 DEBUG oslo_concurrency.lockutils [req-21b690ac-4064-431d-a5a4-764a010502b2 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:41:11.087 134146 DEBUG oslo_concurrency.lockutils [req-21b690ac-4064-431d-a5a4-764a010502b2 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:41:17.051 134140 DEBUG oslo_service.periodic_task [req-c968b2ff-015b-4e45-9a07-b05503b7d414 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:41:17.056 134140 DEBUG 
oslo_concurrency.lockutils [req-dfc4e9bd-5555-4684-bc45-897183136d70 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:41:17.056 134140 DEBUG oslo_concurrency.lockutils [req-dfc4e9bd-5555-4684-bc45-897183136d70 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:41:18.082 134138 DEBUG oslo_service.periodic_task [req-b2ce5529-3f45-4eb8-8215-5244b359df89 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:41:18.086 134138 DEBUG oslo_concurrency.lockutils [req-350e994f-42b4-41d0-a802-a94094dbdbd7 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:41:18.086 134138 DEBUG oslo_concurrency.lockutils [req-350e994f-42b4-41d0-a802-a94094dbdbd7 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:41:27.051 134145 DEBUG oslo_service.periodic_task [req-b9cceaf5-5ee8-4e90-a332-ee47be3069bd - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:41:27.055 134145 DEBUG oslo_concurrency.lockutils [req-3ff1c4bb-80e5-4a7f-809e-833bd0232c6e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" 
acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:41:27.056 134145 DEBUG oslo_concurrency.lockutils [req-3ff1c4bb-80e5-4a7f-809e-833bd0232c6e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:41:33.110 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:41:33.111 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:41:33.111 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:41:33.148 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:41:33.148 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:41:33.148 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:41:33.150 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:41:33.150 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:41:33.150 134145 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:41:33.176 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:41:33.176 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:41:33.176 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:41:41.100 134146 DEBUG oslo_service.periodic_task [req-21b690ac-4064-431d-a5a4-764a010502b2 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:41:41.105 134146 DEBUG oslo_concurrency.lockutils [req-7e419cff-b836-424c-80ac-36136fdd3d32 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:41:41.105 134146 DEBUG oslo_concurrency.lockutils [req-7e419cff-b836-424c-80ac-36136fdd3d32 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:41:48.052 134140 DEBUG oslo_service.periodic_task [req-dfc4e9bd-5555-4684-bc45-897183136d70 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:41:48.057 134140 DEBUG oslo_concurrency.lockutils 
[req-08e3af87-0f27-4cda-87fa-44ac94d30aa0 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:41:48.058 134140 DEBUG oslo_concurrency.lockutils [req-08e3af87-0f27-4cda-87fa-44ac94d30aa0 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:41:48.094 134138 DEBUG oslo_service.periodic_task [req-350e994f-42b4-41d0-a802-a94094dbdbd7 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:41:48.099 134138 DEBUG oslo_concurrency.lockutils [req-c4d56df6-7b60-4b8f-a6ca-e48d9c767981 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:41:48.099 134138 DEBUG oslo_concurrency.lockutils [req-c4d56df6-7b60-4b8f-a6ca-e48d9c767981 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:41:57.061 134145 DEBUG oslo_service.periodic_task [req-3ff1c4bb-80e5-4a7f-809e-833bd0232c6e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:41:57.066 134145 DEBUG oslo_concurrency.lockutils [req-61287686-52f6-43c8-9d75-eda62e2ce7a6 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:41:57.066 134145 DEBUG oslo_concurrency.lockutils [req-61287686-52f6-43c8-9d75-eda62e2ce7a6 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:42:11.114 134146 DEBUG oslo_service.periodic_task [req-7e419cff-b836-424c-80ac-36136fdd3d32 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:42:11.121 134146 DEBUG oslo_concurrency.lockutils [req-5c66be31-d08e-4efd-852d-5a1b176e1a37 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:42:11.121 134146 DEBUG oslo_concurrency.lockutils [req-5c66be31-d08e-4efd-852d-5a1b176e1a37 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:42:18.066 134140 DEBUG oslo_service.periodic_task [req-08e3af87-0f27-4cda-87fa-44ac94d30aa0 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:42:18.071 134140 DEBUG oslo_concurrency.lockutils [req-449fe60e-1b08-4c95-86a9-4f09b74a7991 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:42:18.074 134140 DEBUG oslo_concurrency.lockutils [req-449fe60e-1b08-4c95-86a9-4f09b74a7991 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.003s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:42:18.104 134138 DEBUG oslo_service.periodic_task [req-c4d56df6-7b60-4b8f-a6ca-e48d9c767981 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:42:18.109 134138 DEBUG oslo_concurrency.lockutils [req-6c9bb812-b765-4b09-8cfd-94527c008c97 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:42:18.109 134138 DEBUG oslo_concurrency.lockutils [req-6c9bb812-b765-4b09-8cfd-94527c008c97 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:42:22.892 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: f1190df54db84ca0b25ab47ba45fd7b3 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:42:22.892 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: f1190df54db84ca0b25ab47ba45fd7b3 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:42:22.892 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: f1190df54db84ca0b25ab47ba45fd7b3 __call__ 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:42:22.892 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: f1190df54db84ca0b25ab47ba45fd7b3 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:42:22.892 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:42:22.892 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:42:22.892 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:42:22.892 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f1190df54db84ca0b25ab47ba45fd7b3 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:42:22.892 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f1190df54db84ca0b25ab47ba45fd7b3 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:42:22.892 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f1190df54db84ca0b25ab47ba45fd7b3 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:42:22.892 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:42:22.892 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:42:22.892 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:42:22.892 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:42:22.893 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:42:22.893 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:42:22.893 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f1190df54db84ca0b25ab47ba45fd7b3 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:42:22.893 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:42:22.893 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:42:22.893 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:42:22.893 134140 DEBUG oslo_concurrency.lockutils [req-46f32cc2-c22f-48c9-bed6-c2060ffc1458 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:42:22.893 134145 DEBUG oslo_concurrency.lockutils [req-46f32cc2-c22f-48c9-bed6-c2060ffc1458 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 
2026-04-23 01:42:22.893 134146 DEBUG oslo_concurrency.lockutils [req-46f32cc2-c22f-48c9-bed6-c2060ffc1458 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:42:22.893 134145 DEBUG nova.scheduler.host_manager [req-46f32cc2-c22f-48c9-bed6-c2060ffc1458 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:42:22.893 134140 DEBUG nova.scheduler.host_manager [req-46f32cc2-c22f-48c9-bed6-c2060ffc1458 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:42:22.894 134146 DEBUG nova.scheduler.host_manager [req-46f32cc2-c22f-48c9-bed6-c2060ffc1458 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:42:22.894 134138 DEBUG oslo_concurrency.lockutils [req-46f32cc2-c22f-48c9-bed6-c2060ffc1458 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:42:22.894 134145 DEBUG oslo_concurrency.lockutils [req-46f32cc2-c22f-48c9-bed6-c2060ffc1458 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:42:22.894 134140 DEBUG oslo_concurrency.lockutils [req-46f32cc2-c22f-48c9-bed6-c2060ffc1458 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:42:22.894 134138 DEBUG nova.scheduler.host_manager [req-46f32cc2-c22f-48c9-bed6-c2060ffc1458 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:42:22.894 134146 DEBUG oslo_concurrency.lockutils [req-46f32cc2-c22f-48c9-bed6-c2060ffc1458 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:42:22.894 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:42:22.894 134138 DEBUG oslo_concurrency.lockutils [req-46f32cc2-c22f-48c9-bed6-c2060ffc1458 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:42:22.894 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:42:22.894 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:42:22.894 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:42:22.894 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:42:22.894 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:42:22.894 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:42:22.894 134146 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:42:22.894 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:42:22.894 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:42:22.895 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:42:22.895 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:42:23.895 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:42:23.895 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:42:23.895 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:42:23.895 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:42:23.895 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:42:23.895 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:42:23.896 134146 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:42:23.896 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:42:23.896 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:42:23.896 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:42:23.896 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:42:23.896 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:42:25.896 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:42:25.897 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:42:25.897 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:42:25.897 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:42:25.897 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:42:25.897 
134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:42:25.897 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:42:25.898 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:42:25.898 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:42:25.898 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:42:25.898 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:42:25.898 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:42:28.053 134145 DEBUG oslo_service.periodic_task [req-61287686-52f6-43c8-9d75-eda62e2ce7a6 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:42:28.058 134145 DEBUG oslo_concurrency.lockutils [req-302ab4da-f1b2-4678-a03e-7241a9181215 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:42:28.058 134145 DEBUG oslo_concurrency.lockutils [req-302ab4da-f1b2-4678-a03e-7241a9181215 - - - - -] Lock 
"de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:42:29.898 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:42:29.898 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:42:29.898 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:42:29.900 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:42:29.900 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:42:29.900 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:42:29.902 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:42:29.902 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:42:29.902 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:42:29.902 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:42:29.902 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:42:29.902 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:42:37.900 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:42:37.900 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:42:37.900 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:42:37.903 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:42:37.903 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:42:37.903 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:42:37.904 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:42:37.905 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:42:37.905 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection 
consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:42:37.906 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:42:37.907 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:42:37.908 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:42:41.128 134146 DEBUG oslo_service.periodic_task [req-5c66be31-d08e-4efd-852d-5a1b176e1a37 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:42:41.134 134146 DEBUG oslo_concurrency.lockutils [req-2b170231-d96f-46ef-91b5-6b3cd3a28a90 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:42:41.135 134146 DEBUG oslo_concurrency.lockutils [req-2b170231-d96f-46ef-91b5-6b3cd3a28a90 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:42:48.086 134140 DEBUG oslo_service.periodic_task [req-449fe60e-1b08-4c95-86a9-4f09b74a7991 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:42:48.092 134140 DEBUG oslo_concurrency.lockutils [req-fece3a40-8fa5-40ea-86d5-1574ee644db8 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" 
acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:42:48.093 134140 DEBUG oslo_concurrency.lockutils [req-fece3a40-8fa5-40ea-86d5-1574ee644db8 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:42:48.116 134138 DEBUG oslo_service.periodic_task [req-6c9bb812-b765-4b09-8cfd-94527c008c97 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:42:48.129 134138 DEBUG oslo_concurrency.lockutils [req-53f8e35e-94df-4534-bae2-bb53df7ad3d9 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:42:48.129 134138 DEBUG oslo_concurrency.lockutils [req-53f8e35e-94df-4534-bae2-bb53df7ad3d9 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:42:53.902 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:42:53.902 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:42:53.902 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 
2026-04-23 01:42:53.905 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:42:53.906 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:42:53.906 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:42:53.906 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:42:53.906 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:42:53.906 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:42:53.909 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:42:53.909 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:42:53.909 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:42:58.065 134145 DEBUG oslo_service.periodic_task [req-302ab4da-f1b2-4678-a03e-7241a9181215 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:42:58.072 134145 DEBUG oslo_concurrency.lockutils 
[req-3b8763dc-ff2f-43fe-86c5-89109ceb2792 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:42:58.072 134145 DEBUG oslo_concurrency.lockutils [req-3b8763dc-ff2f-43fe-86c5-89109ceb2792 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:43:12.069 134146 DEBUG oslo_service.periodic_task [req-2b170231-d96f-46ef-91b5-6b3cd3a28a90 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:43:12.073 134146 DEBUG oslo_concurrency.lockutils [req-6b73daf9-821a-4031-abd2-f1036cd12112 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:43:12.073 134146 DEBUG oslo_concurrency.lockutils [req-6b73daf9-821a-4031-abd2-f1036cd12112 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:43:18.137 134138 DEBUG oslo_service.periodic_task [req-53f8e35e-94df-4534-bae2-bb53df7ad3d9 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:43:18.141 134138 DEBUG oslo_concurrency.lockutils [req-82167a38-9175-4c8f-aabc-58634fd03e6a - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:43:18.141 134138 DEBUG oslo_concurrency.lockutils [req-82167a38-9175-4c8f-aabc-58634fd03e6a - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:43:19.052 134140 DEBUG oslo_service.periodic_task [req-fece3a40-8fa5-40ea-86d5-1574ee644db8 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:43:19.057 134140 DEBUG oslo_concurrency.lockutils [req-057d854a-7112-4c0c-bed1-05bb78d21d00 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:43:19.057 134140 DEBUG oslo_concurrency.lockutils [req-057d854a-7112-4c0c-bed1-05bb78d21d00 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:43:29.052 134145 DEBUG oslo_service.periodic_task [req-3b8763dc-ff2f-43fe-86c5-89109ceb2792 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:43:29.057 134145 DEBUG oslo_concurrency.lockutils [req-7c30a40e-3540-4ae8-9595-e5823f58469f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:43:29.057 134145 DEBUG oslo_concurrency.lockutils [req-7c30a40e-3540-4ae8-9595-e5823f58469f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:43:33.115 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:43:33.115 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:43:33.115 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:43:33.150 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:43:33.150 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:43:33.150 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:43:33.150 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:43:33.150 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:43:33.150 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:43:33.180 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:43:33.181 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:43:33.181 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:43:42.083 134146 DEBUG oslo_service.periodic_task [req-6b73daf9-821a-4031-abd2-f1036cd12112 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:43:42.087 134146 DEBUG oslo_concurrency.lockutils [req-0f7ff4fd-e87d-47e2-b9df-bdac49295f0f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:43:42.088 134146 DEBUG oslo_concurrency.lockutils [req-0f7ff4fd-e87d-47e2-b9df-bdac49295f0f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:43:48.146 134138 DEBUG oslo_service.periodic_task [req-82167a38-9175-4c8f-aabc-58634fd03e6a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:43:48.165 134138 DEBUG oslo_concurrency.lockutils [req-48cd36a0-0f0e-4956-8e46-016d2ebb9498 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:43:48.165 134138 DEBUG oslo_concurrency.lockutils [req-48cd36a0-0f0e-4956-8e46-016d2ebb9498 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:43:50.052 134140 DEBUG oslo_service.periodic_task [req-057d854a-7112-4c0c-bed1-05bb78d21d00 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:43:50.057 134140 DEBUG oslo_concurrency.lockutils [req-6344a405-fd08-4963-a379-61f89d589ed0 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:43:50.057 134140 DEBUG oslo_concurrency.lockutils [req-6344a405-fd08-4963-a379-61f89d589ed0 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:44:00.052 134145 DEBUG oslo_service.periodic_task [req-7c30a40e-3540-4ae8-9595-e5823f58469f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:44:00.056 134145 DEBUG oslo_concurrency.lockutils [req-9c770934-7ff1-4903-a234-60811b2d8739 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:44:00.057 134145 DEBUG oslo_concurrency.lockutils [req-9c770934-7ff1-4903-a234-60811b2d8739 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:44:03.658 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: 38337f464c6c454abd1e674b6b752bba reply to reply_b20bc86560af44159724b4fea607f8c2 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-23 01:44:03.659 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:03.659 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 48d8954bd04c4f6999c57e42456f25be poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:44:03.659 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:03.659 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:44:03.662 134138 DEBUG nova.scheduler.manager [req-ecc88289-7bb1-44f9-bd6f-d0a8b2b2ca1b abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Starting to schedule for instances: ['19611b92-1a90-4966-9a50-22bf92d2457d'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-23 01:44:03.664 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:44:03.664 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:03.664 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:44:03.671 134138 DEBUG nova.scheduler.request_filter [req-ecc88289-7bb1-44f9-bd6f-d0a8b2b2ca1b abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:44:03.671 134138 DEBUG nova.scheduler.request_filter [req-ecc88289-7bb1-44f9-bd6f-d0a8b2b2ca1b abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-23 01:44:03.671 134138 DEBUG nova.scheduler.request_filter [req-ecc88289-7bb1-44f9-bd6f-d0a8b2b2ca1b abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:44:03.672 134138 DEBUG nova.scheduler.request_filter [req-ecc88289-7bb1-44f9-bd6f-d0a8b2b2ca1b abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 
01:44:03.672 134138 DEBUG nova.scheduler.request_filter [req-ecc88289-7bb1-44f9-bd6f-d0a8b2b2ca1b abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:44:03.684 134138 DEBUG oslo_concurrency.lockutils [req-ecc88289-7bb1-44f9-bd6f-d0a8b2b2ca1b abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:44:03.685 134138 DEBUG oslo_concurrency.lockutils [req-ecc88289-7bb1-44f9-bd6f-d0a8b2b2ca1b abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:44:04.153 134138 DEBUG oslo_messaging._drivers.amqpdriver [req-ecc88289-7bb1-44f9-bd6f-d0a8b2b2ca1b abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: 66b852c324654b1e88c13864f23389c4 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:44:04.158 134138 DEBUG oslo_concurrency.lockutils [req-ecc88289-7bb1-44f9-bd6f-d0a8b2b2ca1b abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock 
"de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:44:04.159 134138 DEBUG oslo_concurrency.lockutils [req-ecc88289-7bb1-44f9-bd6f-d0a8b2b2ca1b abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:44:04.174 134138 DEBUG oslo_concurrency.lockutils [req-ecc88289-7bb1-44f9-bd6f-d0a8b2b2ca1b abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:44:04.174 134138 DEBUG nova.scheduler.host_manager [req-ecc88289-7bb1-44f9-bd6f-d0a8b2b2ca1b abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["fpu", "apic", "pku", "pge", "pse", "fsgsbase", "rtm", "mmx", "avx512bw", "cx16", "avx2", "cmov", "xsavec", "clwb", "smap", "clflushopt", "avx512-vpopcntdq", "pat", "lm", "pclmuldq", "msr", "sse4.2", "ssse3", "arat", "adx", "sse2", "fma", "aes", "rdtscp", "avx512vl", "xsaveopt", "3dnowprefetch", "vme", "hle", "sse4.1", "syscall", "vpclmulqdq", "cx8", 
"lahf_lm", "clflush", "avx512cd", "tsc", "avx512dq", "gfni", "pse36", "wbnoinvd", "sep", "nx", "avx512vbmi", "popcnt", "spec-ctrl", "vaes", "pae", "fxsr", "ssbd", "la57", "avx512f", "hypervisor", "mca", "abm", "avx512vbmi2", "mtrr", "movbe", "pni", "rdseed", "xgetbv1", "pcid", "invpcid", "erms", "ht", "avx512bitalg", "mce", "umip", "de", "rdrand", "pdpe1gb", "smep", "xsave", "bmi1", "bmi2", "avx", "avx512vnni", "x2apic", "tsc-deadline", "f16c", "sse"]}',created_at=2026-04-23T00:42:02Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=26,free_disk_gb=77,free_ram_mb=31387,host='cn-jenkins-deploy-platform-juju-os-697-1',host_ip=10.0.0.129,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-697-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=0,mapped=1,memory_mb=31899,memory_mb_used=512,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035207, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", 
"nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["id", "cpu_usage", "memory_usage", "pinned_cpus", "network_metadata", "siblings", "memory", "socket", "pcpuset", "mempages", "cpuset"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=0,service_id=None,stats={failed_builds='0'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-23T01:43:24Z,uuid=aa32474e-00f6-4c5b-a2ec-d12a3f31dd05,vcpus=8,vcpus_used=0) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-23 01:44:04.176 134138 DEBUG nova.scheduler.host_manager [req-ecc88289-7bb1-44f9-bd6f-d0a8b2b2ca1b abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-23 01:44:04.176 134138 DEBUG nova.scheduler.host_manager [req-ecc88289-7bb1-44f9-bd6f-d0a8b2b2ca1b abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with service dict: {'id': 9, 'uuid': '979fb67d-7796-4f44-89f1-b2234e3e1381', 'host': 'cn-jenkins-deploy-platform-juju-os-697-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 370, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 23, 1, 44, 2, 
tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 23, 0, 42, 2, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 23, 1, 44, 2, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-23 01:44:04.177 134138 DEBUG nova.scheduler.host_manager [req-ecc88289-7bb1-44f9-bd6f-d0a8b2b2ca1b abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-23 01:44:04.177 134138 DEBUG oslo_concurrency.lockutils [req-ecc88289-7bb1-44f9-bd6f-d0a8b2b2ca1b abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.003s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:44:04.177 134138 INFO nova.scheduler.host_manager [req-ecc88289-7bb1-44f9-bd6f-d0a8b2b2ca1b abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-697-1 2026-04-23 01:44:04.177 134138 DEBUG nova.scheduler.manager [req-ecc88289-7bb1-44f9-bd6f-d0a8b2b2ca1b abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0]) _get_sorted_hosts 
/usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-23 01:44:04.178 134138 DEBUG nova.scheduler.manager [req-ecc88289-7bb1-44f9-bd6f-d0a8b2b2ca1b abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-23 01:44:04.178 134138 DEBUG nova.scheduler.utils [req-ecc88289-7bb1-44f9-bd6f-d0a8b2b2ca1b abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Attempting to claim resources in the placement API for instance 19611b92-1a90-4966-9a50-22bf92d2457d claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-23 01:44:04.288 134138 DEBUG nova.scheduler.manager [req-ecc88289-7bb1-44f9-bd6f-d0a8b2b2ca1b abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] [instance: 19611b92-1a90-4966-9a50-22bf92d2457d] Selected host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-23 01:44:04.289 134138 DEBUG oslo_concurrency.lockutils [req-ecc88289-7bb1-44f9-bd6f-d0a8b2b2ca1b abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 
2026-04-23 01:44:04.289 134138 DEBUG oslo_concurrency.lockutils [req-ecc88289-7bb1-44f9-bd6f-d0a8b2b2ca1b abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:44:04.291 134138 DEBUG oslo_messaging._drivers.amqpdriver [req-ecc88289-7bb1-44f9-bd6f-d0a8b2b2ca1b abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: addfa4cb96d54ac8995f6a6aff681d77 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:44:04.297 134138 DEBUG oslo_messaging._drivers.amqpdriver [req-ecc88289-7bb1-44f9-bd6f-d0a8b2b2ca1b abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] sending reply msg_id: 38337f464c6c454abd1e674b6b752bba reply queue: reply_b20bc86560af44159724b4fea607f8c2 time elapsed: 0.6385864989997572s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-23 01:44:04.666 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:44:04.666 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:04.666 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:44:06.668 134138 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:44:06.668 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:06.668 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:44:10.670 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:44:10.671 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:10.671 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:44:12.094 134146 DEBUG oslo_service.periodic_task [req-0f7ff4fd-e87d-47e2-b9df-bdac49295f0f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:44:12.098 134146 DEBUG oslo_concurrency.lockutils [req-c04abaa8-ea9f-475b-96ac-3edef361ca9c - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:44:12.098 134146 DEBUG oslo_concurrency.lockutils [req-c04abaa8-ea9f-475b-96ac-3edef361ca9c - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 
2026-04-23 01:44:12.261 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: fd4a2d5a0048427f8e04fc38d53c934b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:44:12.261 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: fd4a2d5a0048427f8e04fc38d53c934b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:44:12.261 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: fd4a2d5a0048427f8e04fc38d53c934b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:44:12.262 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:12.262 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:12.262 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: fd4a2d5a0048427f8e04fc38d53c934b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:44:12.262 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:12.262 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: fd4a2d5a0048427f8e04fc38d53c934b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:44:12.262 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: fd4a2d5a0048427f8e04fc38d53c934b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:44:12.262 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:12.262 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:44:12.262 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:12.262 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:12.262 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:44:12.262 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:44:12.265 134140 DEBUG oslo_concurrency.lockutils [req-ecc88289-7bb1-44f9-bd6f-d0a8b2b2ca1b abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:44:12.266 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: fd4a2d5a0048427f8e04fc38d53c934b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:44:12.266 134140 DEBUG oslo_concurrency.lockutils [req-ecc88289-7bb1-44f9-bd6f-d0a8b2b2ca1b abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.001s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:44:12.266 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:12.266 134145 DEBUG oslo_concurrency.lockutils [req-ecc88289-7bb1-44f9-bd6f-d0a8b2b2ca1b abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:44:12.266 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: fd4a2d5a0048427f8e04fc38d53c934b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:44:12.266 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:44:12.267 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:12.266 134145 DEBUG oslo_concurrency.lockutils [req-ecc88289-7bb1-44f9-bd6f-d0a8b2b2ca1b abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:44:12.267 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:44:12.266 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:12.267 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:12.267 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:12.267 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:12.265 134146 DEBUG oslo_concurrency.lockutils [req-ecc88289-7bb1-44f9-bd6f-d0a8b2b2ca1b abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:44:12.267 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:12.267 134146 DEBUG oslo_concurrency.lockutils [req-ecc88289-7bb1-44f9-bd6f-d0a8b2b2ca1b abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:44:12.268 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:12.268 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:12.268 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:12.270 134138 DEBUG oslo_concurrency.lockutils [req-ecc88289-7bb1-44f9-bd6f-d0a8b2b2ca1b abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:44:12.271 134138 DEBUG oslo_concurrency.lockutils [req-ecc88289-7bb1-44f9-bd6f-d0a8b2b2ca1b abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:44:12.271 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:12.271 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:12.271 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:13.268 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:13.269 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:13.269 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:13.269 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:13.269 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:13.269 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:13.270 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:13.270 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:13.270 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:13.273 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:13.273 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:13.273 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:15.270 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:15.270 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:15.270 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:15.271 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:15.272 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:15.272 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:15.272 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:15.273 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:15.273 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:15.274 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:15.274 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:15.274 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:18.170 134138 DEBUG oslo_service.periodic_task [req-48cd36a0-0f0e-4956-8e46-016d2ebb9498 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:44:18.175 134138 DEBUG oslo_concurrency.lockutils [req-1f6a7744-7355-4bac-88b1-e78d498ccae6 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:44:18.175 134138 DEBUG oslo_concurrency.lockutils [req-1f6a7744-7355-4bac-88b1-e78d498ccae6 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:44:19.273 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:19.273 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:19.273 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:19.273 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:19.274 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:19.274 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:19.275 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:19.275 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:19.275 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:19.275 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:19.275 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:19.275 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:20.062 134140 DEBUG oslo_service.periodic_task [req-6344a405-fd08-4963-a379-61f89d589ed0 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:44:20.066 134140 DEBUG oslo_concurrency.lockutils [req-a3172db4-847d-4fd2-8128-8c1ee5bd0b74 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:44:20.067 134140 DEBUG oslo_concurrency.lockutils [req-a3172db4-847d-4fd2-8128-8c1ee5bd0b74 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:44:24.087 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 41dee81a5f264102bfe35fac0ba182c1 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:44:24.087 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 41dee81a5f264102bfe35fac0ba182c1 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:44:24.087 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 41dee81a5f264102bfe35fac0ba182c1 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:44:24.087 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 41dee81a5f264102bfe35fac0ba182c1 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:44:24.087 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:24.088 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:24.088 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:24.088 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:24.088 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 41dee81a5f264102bfe35fac0ba182c1 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:44:24.088 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 41dee81a5f264102bfe35fac0ba182c1 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:44:24.088 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 41dee81a5f264102bfe35fac0ba182c1 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:44:24.088 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 41dee81a5f264102bfe35fac0ba182c1 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:44:24.088 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:24.088 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:24.088 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:24.088 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:24.088 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:24.088 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:24.088 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:24.088 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:24.089 134145 DEBUG oslo_concurrency.lockutils [req-1b62397c-08a8-4d7f-878f-3c0873b70084 abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:44:24.089 134146 DEBUG oslo_concurrency.lockutils [req-1b62397c-08a8-4d7f-878f-3c0873b70084 abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:44:24.089 134138 DEBUG oslo_concurrency.lockutils [req-1b62397c-08a8-4d7f-878f-3c0873b70084 abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:44:24.089 134140 DEBUG oslo_concurrency.lockutils [req-1b62397c-08a8-4d7f-878f-3c0873b70084 abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:44:24.089 134138 DEBUG oslo_concurrency.lockutils [req-1b62397c-08a8-4d7f-878f-3c0873b70084 abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:44:24.089 134146 DEBUG oslo_concurrency.lockutils [req-1b62397c-08a8-4d7f-878f-3c0873b70084 abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:44:24.089 134145 DEBUG oslo_concurrency.lockutils [req-1b62397c-08a8-4d7f-878f-3c0873b70084 abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:44:24.089 134140 DEBUG oslo_concurrency.lockutils [req-1b62397c-08a8-4d7f-878f-3c0873b70084 abbd6084a868489195ba9a44331ab449 a974891ad5b7461895d9e2eca7350a57 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:44:24.090 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:24.090 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:24.090 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:24.091 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:24.090 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:24.091 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:24.091 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:24.091 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:24.091 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:24.091 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:24.091 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:24.091 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:25.091 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:25.092 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:25.092 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:25.092 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:25.092 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:25.092 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:25.092 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:25.092 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:25.092 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:25.093 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:25.093 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:25.093 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:25.151 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ca65a3b681cb4d13ba586ee9644afb20 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:44:25.151 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ca65a3b681cb4d13ba586ee9644afb20 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:44:25.151 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ca65a3b681cb4d13ba586ee9644afb20 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:44:25.152 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:25.151 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ca65a3b681cb4d13ba586ee9644afb20 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:44:25.152 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:25.152 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ca65a3b681cb4d13ba586ee9644afb20 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:44:25.152 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:25.152 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ca65a3b681cb4d13ba586ee9644afb20 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:44:25.152 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ca65a3b681cb4d13ba586ee9644afb20 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:44:25.152 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:25.152 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:25.152 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:25.152 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:25.152 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:25.152 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:25.152 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ca65a3b681cb4d13ba586ee9644afb20 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:44:25.152 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:25.152 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:25.152 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:25.153 134145 DEBUG oslo_concurrency.lockutils [req-fbbf299a-6c3b-4502-bc93-8df7ee720368 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:44:25.153 134138 DEBUG oslo_concurrency.lockutils [req-fbbf299a-6c3b-4502-bc93-8df7ee720368 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:44:25.153 134146 DEBUG oslo_concurrency.lockutils [req-fbbf299a-6c3b-4502-bc93-8df7ee720368 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:44:25.153 134145 DEBUG nova.scheduler.host_manager [req-fbbf299a-6c3b-4502-bc93-8df7ee720368 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:44:25.153 134138 DEBUG nova.scheduler.host_manager [req-fbbf299a-6c3b-4502-bc93-8df7ee720368 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:44:25.153 134138 DEBUG oslo_concurrency.lockutils [req-fbbf299a-6c3b-4502-bc93-8df7ee720368 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:44:25.153 134146 DEBUG nova.scheduler.host_manager [req-fbbf299a-6c3b-4502-bc93-8df7ee720368 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:44:25.153 134145 DEBUG oslo_concurrency.lockutils [req-fbbf299a-6c3b-4502-bc93-8df7ee720368 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:44:25.153 134140 DEBUG oslo_concurrency.lockutils [req-fbbf299a-6c3b-4502-bc93-8df7ee720368 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:44:25.153 134146 DEBUG oslo_concurrency.lockutils [req-fbbf299a-6c3b-4502-bc93-8df7ee720368 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:44:25.153 134140 DEBUG nova.scheduler.host_manager [req-fbbf299a-6c3b-4502-bc93-8df7ee720368 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:44:25.153 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:25.153 134140 DEBUG oslo_concurrency.lockutils [req-fbbf299a-6c3b-4502-bc93-8df7ee720368 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:44:25.153 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:25.154 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:25.154 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:25.154 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:25.154 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:25.154 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:25.154 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:25.154 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:25.155 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:25.155 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:25.155 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:26.154 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:26.155 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:26.155 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:26.156 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:26.156 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:26.156 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:26.156 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:26.156 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:26.156 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:26.156 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:26.156 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:26.157 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:28.156 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:28.157 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:28.157 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:28.157 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:28.157 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:28.158 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:28.158 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:28.158 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:28.158 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:28.158 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:28.158 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:28.158 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:31.051 134145 DEBUG oslo_service.periodic_task [req-9c770934-7ff1-4903-a234-60811b2d8739 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:44:31.056 134145 DEBUG oslo_concurrency.lockutils [req-4477c673-18dc-42e0-ab17-ef8069fb0aa8 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:44:31.056 134145 DEBUG oslo_concurrency.lockutils [req-4477c673-18dc-42e0-ab17-ef8069fb0aa8 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:44:32.158 134145 DEBUG oslo_messaging._drivers.amqpdriver
[-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:32.158 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:32.158 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:32.162 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:32.162 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:32.162 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:32.162 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:32.162 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:32.162 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:32.163 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:32.163 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:32.163 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:40.162 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:40.162 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:40.162 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:40.167 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:40.167 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:40.168 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:40.168 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:40.168 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:40.169 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:40.174 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:40.174 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:40.174 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:43.069 134146 DEBUG oslo_service.periodic_task [req-c04abaa8-ea9f-475b-96ac-3edef361ca9c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:44:43.077 134146 DEBUG oslo_concurrency.lockutils [req-af13ef55-adb6-4433-88d6-5e15f3c8e2a9 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:44:43.077 134146 DEBUG oslo_concurrency.lockutils [req-af13ef55-adb6-4433-88d6-5e15f3c8e2a9 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:44:45.399 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: 831f58c2118b474f8846d9c2bd1f426a reply to reply_85578e9317f24a84af7e369b8d783622 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320
2026-04-23 01:44:45.400 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:45.400 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 894c2005f00046b481111642ffecdd7a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:44:45.400 134145 DEBUG oslo_messaging._drivers.amqpdriver
[-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:45.400 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:45.402 134145 DEBUG nova.scheduler.manager [req-b9ffbd38-3771-44a6-9e80-68136b71e616 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Starting to schedule for instances: ['e41bb808-3301-4a5b-b7b3-34f64d485d86'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141
2026-04-23 01:44:45.405 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:45.405 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:45.405 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:45.413 134145 DEBUG nova.scheduler.request_filter [req-b9ffbd38-3771-44a6-9e80-68136b71e616 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-23 01:44:45.415 134145 DEBUG nova.scheduler.request_filter [req-b9ffbd38-3771-44a6-9e80-68136b71e616 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254
2026-04-23 01:44:45.417 134145 DEBUG nova.scheduler.request_filter [req-b9ffbd38-3771-44a6-9e80-68136b71e616 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-23 01:44:45.418 134145 DEBUG nova.scheduler.request_filter [req-b9ffbd38-3771-44a6-9e80-68136b71e616 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-23 01:44:45.418 134145 DEBUG nova.scheduler.request_filter [req-b9ffbd38-3771-44a6-9e80-68136b71e616 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-23 01:44:45.426 134145 DEBUG oslo_concurrency.lockutils [req-b9ffbd38-3771-44a6-9e80-68136b71e616 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:44:45.426 134145 DEBUG oslo_concurrency.lockutils [req-b9ffbd38-3771-44a6-9e80-68136b71e616 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:44:45.938 134145 DEBUG oslo_messaging._drivers.amqpdriver [req-b9ffbd38-3771-44a6-9e80-68136b71e616 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: 593d8d2f07f44db4b45eb12f489dabc2 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656
2026-04-23 01:44:45.943 134145 DEBUG oslo_concurrency.lockutils [req-b9ffbd38-3771-44a6-9e80-68136b71e616 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:44:45.944 134145 DEBUG oslo_concurrency.lockutils [req-b9ffbd38-3771-44a6-9e80-68136b71e616 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:44:45.965 134145 DEBUG oslo_concurrency.lockutils [req-b9ffbd38-3771-44a6-9e80-68136b71e616 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.update.<locals>._locked_update" :: waited 0.000s inner
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:44:45.965 134145 DEBUG nova.scheduler.host_manager [req-b9ffbd38-3771-44a6-9e80-68136b71e616 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["fpu", "apic", "pku", "pge", "pse", "fsgsbase", "rtm", "mmx", "avx512bw", "cx16", "avx2", "cmov", "xsavec", "clwb", "smap", "clflushopt", "avx512-vpopcntdq", "pat", "lm", "pclmuldq", "msr", "sse4.2", "ssse3", "arat", "adx", "sse2", "fma", "aes", "rdtscp", "avx512vl", "xsaveopt", "3dnowprefetch", "vme", "hle", "sse4.1", "syscall", "vpclmulqdq", "cx8", "lahf_lm", "clflush", "avx512cd", "tsc", "avx512dq", "gfni", "pse36", "wbnoinvd", "sep", "nx", "avx512vbmi", "popcnt", "spec-ctrl", "vaes", "pae", "fxsr", "ssbd", "la57", "avx512f", "hypervisor", "mca", "abm", "avx512vbmi2", "mtrr", "movbe", "pni", "rdseed", "xgetbv1", "pcid", "invpcid", "erms", "ht", "avx512bitalg", "mce", "umip", "de", "rdrand", "pdpe1gb", "smep", "xsave", "bmi1", "bmi2", "avx", "avx512vnni", "x2apic", "tsc-deadline", "f16c", "sse"]}',created_at=2026-04-23T00:42:02Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=26,free_disk_gb=77,free_ram_mb=31387,host='cn-jenkins-deploy-platform-juju-os-697-1',host_ip=10.0.0.129,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-697-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=0,mapped=1,memory_mb=31899,memory_mb_used=512,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035207, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["id", "cpu_usage", "memory_usage", "pinned_cpus", "network_metadata", "siblings", "memory", "socket", "pcpuset", "mempages", "cpuset"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=0,service_id=None,stats={failed_builds='0'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-23T01:44:24Z,uuid=aa32474e-00f6-4c5b-a2ec-d12a3f31dd05,vcpus=8,vcpus_used=0) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167
2026-04-23 01:44:45.967 134145 DEBUG nova.scheduler.host_manager [req-b9ffbd38-3771-44a6-9e80-68136b71e616 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170
2026-04-23 01:44:45.967 134145 DEBUG nova.scheduler.host_manager [req-b9ffbd38-3771-44a6-9e80-68136b71e616 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with service dict: {'id': 9, 'uuid': '979fb67d-7796-4f44-89f1-b2234e3e1381', 'host': 'cn-jenkins-deploy-platform-juju-os-697-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 374, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 23, 1, 44, 42, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 23, 0, 42, 2, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 23, 1, 44, 42, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173
2026-04-23 01:44:45.967 134145 DEBUG nova.scheduler.host_manager [req-b9ffbd38-3771-44a6-9e80-68136b71e616 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176
2026-04-23 01:44:45.967 134145 DEBUG oslo_concurrency.lockutils [req-b9ffbd38-3771-44a6-9e80-68136b71e616 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by
"nova.scheduler.host_manager.HostState.update.<locals>._locked_update" :: held 0.003s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:44:45.968 134145 INFO nova.scheduler.host_manager [req-b9ffbd38-3771-44a6-9e80-68136b71e616 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-697-1
2026-04-23 01:44:45.968 134145 DEBUG nova.scheduler.manager [req-b9ffbd38-3771-44a6-9e80-68136b71e616 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610
2026-04-23 01:44:45.968 134145 DEBUG nova.scheduler.manager [req-b9ffbd38-3771-44a6-9e80-68136b71e616 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632
2026-04-23 01:44:45.969 134145 DEBUG nova.scheduler.utils [req-b9ffbd38-3771-44a6-9e80-68136b71e616 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Attempting to claim resources in the placement API for instance e41bb808-3301-4a5b-b7b3-34f64d485d86 claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253
2026-04-23 01:44:46.085 134145 DEBUG nova.scheduler.manager [req-b9ffbd38-3771-44a6-9e80-68136b71e616 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] [instance: e41bb808-3301-4a5b-b7b3-34f64d485d86] Selected host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513
2026-04-23 01:44:46.086 134145 DEBUG oslo_concurrency.lockutils [req-b9ffbd38-3771-44a6-9e80-68136b71e616 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.<locals>._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:44:46.087 134145 DEBUG oslo_concurrency.lockutils [req-b9ffbd38-3771-44a6-9e80-68136b71e616 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.<locals>._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:44:46.088 134145 DEBUG oslo_messaging._drivers.amqpdriver [req-b9ffbd38-3771-44a6-9e80-68136b71e616 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: af1eea43af82460183bbf90967906031 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656
2026-04-23 01:44:46.095 134145 DEBUG oslo_messaging._drivers.amqpdriver [req-b9ffbd38-3771-44a6-9e80-68136b71e616 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] sending reply msg_id: 831f58c2118b474f8846d9c2bd1f426a reply queue: reply_85578e9317f24a84af7e369b8d783622 time elapsed: 0.6951331430000209s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118
2026-04-23 01:44:46.416 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:46.416 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:46.417 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:46.832 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: 9fd014fc10c64661b25fd7aba3d05fe7 reply to reply_196060a39108420a873d953cb9a1134e __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320
2026-04-23 01:44:46.832 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:46.833 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 8177d60101d349c4b6f55b07ed66fced poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:44:46.833 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:46.833 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:46.836 134140 DEBUG nova.scheduler.manager
[req-5e9c5e75-9a61-468b-aa1d-11c70fa9ebdf 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Starting to schedule for instances: ['cc0ca8d1-8759-4fcb-b21e-2136c81ddfb5'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141
2026-04-23 01:44:46.838 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:46.838 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:46.839 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:46.847 134140 DEBUG nova.scheduler.request_filter [req-5e9c5e75-9a61-468b-aa1d-11c70fa9ebdf 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-23 01:44:46.848 134140 DEBUG nova.scheduler.request_filter [req-5e9c5e75-9a61-468b-aa1d-11c70fa9ebdf 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254
2026-04-23 01:44:46.848 134140 DEBUG nova.scheduler.request_filter [req-5e9c5e75-9a61-468b-aa1d-11c70fa9ebdf 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-23 01:44:46.848 134140 DEBUG nova.scheduler.request_filter [req-5e9c5e75-9a61-468b-aa1d-11c70fa9ebdf 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-23 01:44:46.849 134140 DEBUG nova.scheduler.request_filter [req-5e9c5e75-9a61-468b-aa1d-11c70fa9ebdf 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-23 01:44:46.853 134140 DEBUG oslo_concurrency.lockutils [req-5e9c5e75-9a61-468b-aa1d-11c70fa9ebdf 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:44:46.853 134140 DEBUG oslo_concurrency.lockutils [req-5e9c5e75-9a61-468b-aa1d-11c70fa9ebdf 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:44:47.298 134140 DEBUG oslo_messaging._drivers.amqpdriver [req-5e9c5e75-9a61-468b-aa1d-11c70fa9ebdf 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: 710135ccbf684c919f2bddb59148e053 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656
2026-04-23 01:44:47.305 134140 DEBUG oslo_concurrency.lockutils [req-5e9c5e75-9a61-468b-aa1d-11c70fa9ebdf 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:44:47.305 134140 DEBUG oslo_concurrency.lockutils [req-5e9c5e75-9a61-468b-aa1d-11c70fa9ebdf 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:44:47.322 134140 DEBUG oslo_concurrency.lockutils [req-5e9c5e75-9a61-468b-aa1d-11c70fa9ebdf 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.update.<locals>._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:44:47.323 134140 DEBUG nova.scheduler.host_manager [req-5e9c5e75-9a61-468b-aa1d-11c70fa9ebdf 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology":
{"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["fpu", "apic", "pku", "pge", "pse", "fsgsbase", "rtm", "mmx", "avx512bw", "cx16", "avx2", "cmov", "xsavec", "clwb", "smap", "clflushopt", "avx512-vpopcntdq", "pat", "lm", "pclmuldq", "msr", "sse4.2", "ssse3", "arat", "adx", "sse2", "fma", "aes", "rdtscp", "avx512vl", "xsaveopt", "3dnowprefetch", "vme", "hle", "sse4.1", "syscall", "vpclmulqdq", "cx8", "lahf_lm", "clflush", "avx512cd", "tsc", "avx512dq", "gfni", "pse36", "wbnoinvd", "sep", "nx", "avx512vbmi", "popcnt", "spec-ctrl", "vaes", "pae", "fxsr", "ssbd", "la57", "avx512f", "hypervisor", "mca", "abm", "avx512vbmi2", "mtrr", "movbe", "pni", "rdseed", "xgetbv1", "pcid", "invpcid", "erms", "ht", "avx512bitalg", "mce", "umip", "de", "rdrand", "pdpe1gb", "smep", "xsave", "bmi1", "bmi2", "avx", "avx512vnni", "x2apic", "tsc-deadline", "f16c", "sse"]}',created_at=2026-04-23T00:42:02Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=26,free_disk_gb=76,free_ram_mb=30363,host='cn-jenkins-deploy-platform-juju-os-697-1',host_ip=10.0.0.129,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-697-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=1,mapped=1,memory_mb=31899,memory_mb_used=1536,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035207, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["id", "cpu_usage", "memory_usage", "pinned_cpus", "network_metadata", "siblings", "memory", "socket", "pcpuset", "mempages", "cpuset"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=1,service_id=None,stats={failed_builds='0',io_workload='1',num_instances='1',num_os_type_None='1',num_proj_7d53c701f51740daa18445a75434d879='1',num_task_None='1',num_vm_building='1'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-23T01:44:47Z,uuid=aa32474e-00f6-4c5b-a2ec-d12a3f31dd05,vcpus=8,vcpus_used=1) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167
2026-04-23 01:44:47.324 134140 DEBUG nova.scheduler.host_manager [req-5e9c5e75-9a61-468b-aa1d-11c70fa9ebdf 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170
2026-04-23 01:44:47.325 134140 DEBUG nova.scheduler.host_manager [req-5e9c5e75-9a61-468b-aa1d-11c70fa9ebdf 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with service dict: {'id': 9, 'uuid': '979fb67d-7796-4f44-89f1-b2234e3e1381', 'host': 'cn-jenkins-deploy-platform-juju-os-697-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 374, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 23, 1, 44, 42, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 23, 0, 42, 2, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 23, 1, 44, 42, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173
2026-04-23 01:44:47.325 134140 DEBUG nova.scheduler.host_manager [req-5e9c5e75-9a61-468b-aa1d-11c70fa9ebdf 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176
2026-04-23 01:44:47.325 134140 DEBUG oslo_concurrency.lockutils [req-5e9c5e75-9a61-468b-aa1d-11c70fa9ebdf 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.update.<locals>._locked_update" :: held 0.003s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:44:47.326 134140 INFO nova.scheduler.host_manager [req-5e9c5e75-9a61-468b-aa1d-11c70fa9ebdf 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244
463c1a7f457542babd061a9f36cbf244] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-697-1 2026-04-23 01:44:47.326 134140 DEBUG nova.scheduler.manager [req-5e9c5e75-9a61-468b-aa1d-11c70fa9ebdf 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 30363MB disk: 26624MB io_ops: 1 instances: 1]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-23 01:44:47.326 134140 DEBUG nova.scheduler.manager [req-5e9c5e75-9a61-468b-aa1d-11c70fa9ebdf 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 30363MB disk: 26624MB io_ops: 1 instances: 1, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-23 01:44:47.326 134140 DEBUG nova.scheduler.utils [req-5e9c5e75-9a61-468b-aa1d-11c70fa9ebdf 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Attempting to claim resources in the placement API for instance cc0ca8d1-8759-4fcb-b21e-2136c81ddfb5 claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-23 01:44:47.395 134140 DEBUG nova.scheduler.manager [req-5e9c5e75-9a61-468b-aa1d-11c70fa9ebdf 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] [instance: cc0ca8d1-8759-4fcb-b21e-2136c81ddfb5] Selected host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 30363MB disk: 26624MB io_ops: 1 instances: 1 _consume_selected_host 
/usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-23 01:44:47.397 134140 DEBUG oslo_concurrency.lockutils [req-5e9c5e75-9a61-468b-aa1d-11c70fa9ebdf 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:44:47.398 134140 DEBUG oslo_concurrency.lockutils [req-5e9c5e75-9a61-468b-aa1d-11c70fa9ebdf 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:44:47.399 134140 DEBUG oslo_messaging._drivers.amqpdriver [req-5e9c5e75-9a61-468b-aa1d-11c70fa9ebdf 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: 44588ba7340a405e97cff5eecc9d3aab NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:44:47.406 134140 DEBUG oslo_messaging._drivers.amqpdriver [req-5e9c5e75-9a61-468b-aa1d-11c70fa9ebdf 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] sending reply msg_id: 9fd014fc10c64661b25fd7aba3d05fe7 reply queue: reply_196060a39108420a873d953cb9a1134e time elapsed: 0.5736930519997259s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-23 
01:44:47.840 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:44:47.840 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:47.840 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:44:48.183 134138 DEBUG oslo_service.periodic_task [req-1f6a7744-7355-4bac-88b1-e78d498ccae6 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:44:48.187 134138 DEBUG oslo_concurrency.lockutils [req-eea4d2d6-3f75-459c-9f92-ba79d3a62f2d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:44:48.188 134138 DEBUG oslo_concurrency.lockutils [req-eea4d2d6-3f75-459c-9f92-ba79d3a62f2d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:44:48.418 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:44:48.419 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:48.419 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:44:49.843 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:44:49.843 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:49.843 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:44:51.054 134140 DEBUG oslo_service.periodic_task [req-a3172db4-847d-4fd2-8128-8c1ee5bd0b74 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:44:51.066 134140 DEBUG oslo_concurrency.lockutils [req-e874cd28-4c07-4fb2-b234-c722bc1a9523 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:44:51.067 134140 DEBUG oslo_concurrency.lockutils [req-e874cd28-4c07-4fb2-b234-c722bc1a9523 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:44:51.211 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: fff8030cfbed4a64aac58d720775d6c8 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:44:51.211 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: fff8030cfbed4a64aac58d720775d6c8 __call__ 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:44:51.211 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:51.211 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:51.211 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: fff8030cfbed4a64aac58d720775d6c8 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:44:51.211 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: fff8030cfbed4a64aac58d720775d6c8 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:44:51.212 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:51.211 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: fff8030cfbed4a64aac58d720775d6c8 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:44:51.212 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:51.212 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:44:51.212 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:44:51.212 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 
2026-04-23 01:44:51.212 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: fff8030cfbed4a64aac58d720775d6c8 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:44:51.212 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:51.212 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:44:51.214 134138 DEBUG oslo_concurrency.lockutils [req-b9ffbd38-3771-44a6-9e80-68136b71e616 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:44:51.213 134146 DEBUG oslo_concurrency.lockutils [req-b9ffbd38-3771-44a6-9e80-68136b71e616 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:44:51.214 134145 DEBUG oslo_concurrency.lockutils [req-b9ffbd38-3771-44a6-9e80-68136b71e616 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:44:51.214 134138 DEBUG oslo_concurrency.lockutils [req-b9ffbd38-3771-44a6-9e80-68136b71e616 
753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:44:51.214 134145 DEBUG oslo_concurrency.lockutils [req-b9ffbd38-3771-44a6-9e80-68136b71e616 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:44:51.214 134146 DEBUG oslo_concurrency.lockutils [req-b9ffbd38-3771-44a6-9e80-68136b71e616 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:44:51.215 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:44:51.215 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:44:51.215 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:44:51.215 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:51.215 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running 
poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:51.215 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:44:51.215 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:44:51.217 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:51.217 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:44:51.218 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: fff8030cfbed4a64aac58d720775d6c8 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:44:51.218 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:51.221 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: fff8030cfbed4a64aac58d720775d6c8 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:44:51.222 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:51.222 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:44:51.224 134140 DEBUG oslo_concurrency.lockutils [req-b9ffbd38-3771-44a6-9e80-68136b71e616 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 
463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:44:51.226 134140 DEBUG oslo_concurrency.lockutils [req-b9ffbd38-3771-44a6-9e80-68136b71e616 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:44:51.226 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:44:51.226 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:51.226 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:44:51.435 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 233ab33c11af438891021dfe83fdd42a __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:44:51.435 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 233ab33c11af438891021dfe83fdd42a __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:44:51.436 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:51.436 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:51.436 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 233ab33c11af438891021dfe83fdd42a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:44:51.436 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 233ab33c11af438891021dfe83fdd42a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:44:51.436 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:51.436 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:51.436 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 233ab33c11af438891021dfe83fdd42a __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:44:51.436 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:44:51.436 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:44:51.436 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:51.437 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 233ab33c11af438891021dfe83fdd42a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:44:51.437 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:51.437 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:44:51.438 134146 DEBUG oslo_concurrency.lockutils [req-5e9c5e75-9a61-468b-aa1d-11c70fa9ebdf 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:44:51.438 134138 DEBUG oslo_concurrency.lockutils [req-5e9c5e75-9a61-468b-aa1d-11c70fa9ebdf 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:44:51.438 134146 DEBUG oslo_concurrency.lockutils [req-5e9c5e75-9a61-468b-aa1d-11c70fa9ebdf 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:44:51.439 134138 DEBUG oslo_concurrency.lockutils [req-5e9c5e75-9a61-468b-aa1d-11c70fa9ebdf 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:44:51.439 134146 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:44:51.439 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:51.439 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:44:51.439 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:44:51.439 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:51.439 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:44:51.440 134145 DEBUG oslo_concurrency.lockutils [req-5e9c5e75-9a61-468b-aa1d-11c70fa9ebdf 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:44:51.440 134145 DEBUG oslo_concurrency.lockutils [req-5e9c5e75-9a61-468b-aa1d-11c70fa9ebdf 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:44:51.441 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener 
connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:44:51.441 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:51.441 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:44:51.442 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 233ab33c11af438891021dfe83fdd42a __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:44:51.442 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:51.442 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 233ab33c11af438891021dfe83fdd42a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:44:51.442 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:44:51.443 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:44:51.444 134140 DEBUG oslo_concurrency.lockutils [req-5e9c5e75-9a61-468b-aa1d-11c70fa9ebdf 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:44:51.444 134140 DEBUG oslo_concurrency.lockutils [req-5e9c5e75-9a61-468b-aa1d-11c70fa9ebdf 
753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:44:51.445 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:51.445 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:51.445 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:52.440 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:52.440 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:52.440 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:52.441 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:52.442 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:52.442 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:52.442 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:52.442 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:52.443 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:52.448 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:52.449 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:52.449 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:54.443 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:54.444 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:54.445 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:54.444 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:54.446 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:54.446 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:54.445 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:54.446 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:54.447 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:54.450 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:54.450 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:54.451 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:58.447 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:58.448 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:58.448 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:58.448 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:58.449 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:58.449 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:58.450 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:58.450 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:58.450 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:44:58.454 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:44:58.454 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:44:58.454 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:02.052 134145 DEBUG oslo_service.periodic_task [req-4477c673-18dc-42e0-ab17-ef8069fb0aa8 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:45:02.057 134145 DEBUG oslo_concurrency.lockutils [req-54084011-ba8d-438a-8b35-b71d8a042d3b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:45:02.057 134145 DEBUG oslo_concurrency.lockutils [req-54084011-ba8d-438a-8b35-b71d8a042d3b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:45:06.450 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:45:06.450 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:06.450 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:45:06.450 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:06.450 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:06.450 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:06.452 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:45:06.452 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:06.452 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:06.456 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:45:06.456 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:06.456 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:14.068 134146 DEBUG oslo_service.periodic_task [req-af13ef55-adb6-4433-88d6-5e15f3c8e2a9 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:45:14.073 134146 DEBUG oslo_concurrency.lockutils [req-1544a178-8447-49a4-8a6d-551244d25331 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:45:14.073 134146 DEBUG oslo_concurrency.lockutils [req-1544a178-8447-49a4-8a6d-551244d25331 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:45:15.209 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 11748804380648b2a21d84ba596e1c60 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:45:15.209 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 11748804380648b2a21d84ba596e1c60 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:45:15.209 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 11748804380648b2a21d84ba596e1c60 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:45:15.210 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:15.210 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:15.210 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 11748804380648b2a21d84ba596e1c60 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:45:15.210 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 11748804380648b2a21d84ba596e1c60 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:45:15.210 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:15.210 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 11748804380648b2a21d84ba596e1c60 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:45:15.210 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:15.210 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:15.210 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:15.210 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:15.210 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:15.210 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:15.211 134138 DEBUG oslo_concurrency.lockutils [req-5489f73a-34ab-4969-adb0-1509c9f89de4 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:45:15.211 134140 DEBUG oslo_concurrency.lockutils [req-5489f73a-34ab-4969-adb0-1509c9f89de4 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:45:15.211 134145 DEBUG oslo_concurrency.lockutils [req-5489f73a-34ab-4969-adb0-1509c9f89de4 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:45:15.211 134138 DEBUG oslo_concurrency.lockutils [req-5489f73a-34ab-4969-adb0-1509c9f89de4 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:45:15.211 134140 DEBUG oslo_concurrency.lockutils [req-5489f73a-34ab-4969-adb0-1509c9f89de4 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:45:15.211 134145 DEBUG oslo_concurrency.lockutils [req-5489f73a-34ab-4969-adb0-1509c9f89de4 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:45:15.212 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:45:15.212 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:45:15.212 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:45:15.212 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:15.213 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:15.213 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:15.213 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:15.213 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:15.212 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 11748804380648b2a21d84ba596e1c60 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:45:15.213 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:15.213 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:15.213 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 11748804380648b2a21d84ba596e1c60 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:45:15.213 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:15.214 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:15.214 134146 DEBUG oslo_concurrency.lockutils [req-5489f73a-34ab-4969-adb0-1509c9f89de4 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:45:15.214 134146 DEBUG oslo_concurrency.lockutils [req-5489f73a-34ab-4969-adb0-1509c9f89de4 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:45:15.216 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:45:15.216 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:15.216 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:15.319 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: de93cb24d24042e9982fa370f7e74db9 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:45:15.320 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: de93cb24d24042e9982fa370f7e74db9 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:45:15.320 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:15.320 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: de93cb24d24042e9982fa370f7e74db9 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:45:15.320 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:15.320 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:15.320 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: de93cb24d24042e9982fa370f7e74db9 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:45:15.320 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:15.320 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: de93cb24d24042e9982fa370f7e74db9 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:45:15.320 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:15.321 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:15.321 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: de93cb24d24042e9982fa370f7e74db9 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:45:15.321 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:15.321 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:15.321 134145 DEBUG oslo_concurrency.lockutils [req-2bbd27fa-f01b-41ce-af84-74777c3abd40 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:45:15.321 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:15.321 134145 DEBUG oslo_concurrency.lockutils [req-2bbd27fa-f01b-41ce-af84-74777c3abd40 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:45:15.321 134140 DEBUG oslo_concurrency.lockutils [req-2bbd27fa-f01b-41ce-af84-74777c3abd40 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:45:15.322 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:45:15.322 134140 DEBUG oslo_concurrency.lockutils [req-2bbd27fa-f01b-41ce-af84-74777c3abd40 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:45:15.322 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:15.322 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:15.322 134146 DEBUG oslo_concurrency.lockutils [req-2bbd27fa-f01b-41ce-af84-74777c3abd40 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:45:15.322 134146 DEBUG oslo_concurrency.lockutils [req-2bbd27fa-f01b-41ce-af84-74777c3abd40 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:45:15.322 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:45:15.323 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:45:15.323 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:15.323 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: de93cb24d24042e9982fa370f7e74db9 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:45:15.323 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:15.323 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:15.323 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:15.324 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:15.324 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: de93cb24d24042e9982fa370f7e74db9 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:45:15.324 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:15.324 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:15.325 134138 DEBUG oslo_concurrency.lockutils [req-2bbd27fa-f01b-41ce-af84-74777c3abd40 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:45:15.325 134138 DEBUG oslo_concurrency.lockutils [req-2bbd27fa-f01b-41ce-af84-74777c3abd40 753419907c0d433d96ddef0aeb920a9e 7d53c701f51740daa18445a75434d879 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:45:15.326 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:45:15.327 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:15.327 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:16.323 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:45:16.323 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:16.323 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:16.324 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:45:16.324 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:16.325 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:16.325 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:45:16.325 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:16.325 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:16.328 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:45:16.328 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:16.328 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:18.326 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:45:18.326 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:18.326 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:18.327 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:45:18.327 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:18.327 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:18.327 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:45:18.327 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:18.328 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:18.330 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:45:18.330 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:18.330 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:19.020 134138 DEBUG oslo_service.periodic_task [req-eea4d2d6-3f75-459c-9f92-ba79d3a62f2d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:45:19.024 134138 DEBUG oslo_concurrency.lockutils [req-031767c9-1bae-4263-acf5-b0b3a01af821 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:45:19.025 134138 DEBUG oslo_concurrency.lockutils [req-031767c9-1bae-4263-acf5-b0b3a01af821 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:45:21.074 134140 DEBUG oslo_service.periodic_task [req-e874cd28-4c07-4fb2-b234-c722bc1a9523 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:45:21.078 134140 DEBUG oslo_concurrency.lockutils [req-74457521-dec6-497d-9819-a2dc043f8927 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:45:21.078 134140 DEBUG oslo_concurrency.lockutils [req-74457521-dec6-497d-9819-a2dc043f8927 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:45:22.327 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:45:22.328 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:22.328 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:22.329 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:45:22.329 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:22.329 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:45:22.329 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:22.329 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:22.330 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:22.332 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:45:22.332 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:22.332 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:30.332 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:45:30.333 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:30.333 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:30.334 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:45:30.334 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:30.335 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:30.335 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:45:30.335 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:30.335 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:30.338 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:45:30.339 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:45:30.339 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:45:32.065 134145 DEBUG oslo_service.periodic_task [req-54084011-ba8d-438a-8b35-b71d8a042d3b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:45:32.069 134145 DEBUG oslo_concurrency.lockutils [req-9ce24089-0978-4ab7-bb1e-fb4513f717db - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:45:32.070 134145 DEBUG oslo_concurrency.lockutils [req-9ce24089-0978-4ab7-bb1e-fb4513f717db - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:45:45.070 134146 DEBUG oslo_service.periodic_task [req-1544a178-8447-49a4-8a6d-551244d25331 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:45:45.073 134146 DEBUG oslo_concurrency.lockutils [req-9ec79110-bb75-48f8-804c-dbdce27ab8d2 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:45:45.074 134146 DEBUG oslo_concurrency.lockutils [req-9ec79110-bb75-48f8-804c-dbdce27ab8d2 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:45:46.334 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:45:46.334 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:46.334 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 
01:45:46.336 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:45:46.336 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:46.336 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:45:46.338 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:45:46.338 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:46.338 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:45:46.340 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:45:46.340 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:46.340 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:45:49.029 134138 DEBUG oslo_service.periodic_task [req-031767c9-1bae-4263-acf5-b0b3a01af821 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:45:49.034 134138 DEBUG oslo_concurrency.lockutils [req-609aca63-6387-4686-8e3d-310453ae9aba - 
- - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:45:49.034 134138 DEBUG oslo_concurrency.lockutils [req-609aca63-6387-4686-8e3d-310453ae9aba - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:45:50.276 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: 6223f89dba0c489994f005f2780a5f88 reply to reply_196060a39108420a873d953cb9a1134e __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-23 01:45:50.277 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:50.277 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f1ba08197c5f47d0ab311412416244cd poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:45:50.278 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:50.278 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:45:50.280 134146 DEBUG nova.scheduler.manager [req-05a23ce8-3dcc-405c-8a89-5bd22cf4b412 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Starting to schedule for instances: ['eec22cea-a194-4deb-ac95-b5f388fa7f5b'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-23 
01:45:50.282 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:45:50.282 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:50.282 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:45:50.289 134146 DEBUG nova.scheduler.request_filter [req-05a23ce8-3dcc-405c-8a89-5bd22cf4b412 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:45:50.290 134146 DEBUG nova.scheduler.request_filter [req-05a23ce8-3dcc-405c-8a89-5bd22cf4b412 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-23 01:45:50.290 134146 DEBUG nova.scheduler.request_filter [req-05a23ce8-3dcc-405c-8a89-5bd22cf4b412 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:45:50.290 134146 DEBUG nova.scheduler.request_filter [req-05a23ce8-3dcc-405c-8a89-5bd22cf4b412 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'accelerators_filter' took 0.0 
seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:45:50.701 134146 DEBUG nova.scheduler.request_filter [req-05a23ce8-3dcc-405c-8a89-5bd22cf4b412 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'remote_managed_ports_filter' took 0.4 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:45:50.707 134146 DEBUG oslo_concurrency.lockutils [req-05a23ce8-3dcc-405c-8a89-5bd22cf4b412 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:45:50.707 134146 DEBUG oslo_concurrency.lockutils [req-05a23ce8-3dcc-405c-8a89-5bd22cf4b412 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:45:51.083 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: 1d542e6170e04f58a87b54da39889d3c reply to reply_979ed65011a945248f9b38f6dd19b658 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-23 01:45:51.084 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:51.084 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: cd91e165bdad405c938bba76723c0d0e poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:45:51.084 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:51.084 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:45:51.086 134140 DEBUG oslo_service.periodic_task [req-74457521-dec6-497d-9819-a2dc043f8927 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:45:51.086 134138 DEBUG nova.scheduler.manager [req-bdd9334d-3f39-4a17-a3a4-4b109b4d8aa9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Starting to schedule for instances: ['8e1f18a2-bca6-4f46-8ea3-92122e133fe8'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-23 01:45:51.089 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:45:51.089 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:51.089 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:45:51.090 134140 DEBUG oslo_concurrency.lockutils [req-70c292fe-6ff8-498e-ba1a-919ac8b6b481 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:45:51.091 
134140 DEBUG oslo_concurrency.lockutils [req-70c292fe-6ff8-498e-ba1a-919ac8b6b481 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:45:51.092 134138 DEBUG nova.scheduler.request_filter [req-bdd9334d-3f39-4a17-a3a4-4b109b4d8aa9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:45:51.092 134138 DEBUG nova.scheduler.request_filter [req-bdd9334d-3f39-4a17-a3a4-4b109b4d8aa9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-23 01:45:51.093 134138 DEBUG nova.scheduler.request_filter [req-bdd9334d-3f39-4a17-a3a4-4b109b4d8aa9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:45:51.093 134138 DEBUG nova.scheduler.request_filter [req-bdd9334d-3f39-4a17-a3a4-4b109b4d8aa9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:45:51.169 134146 DEBUG oslo_messaging._drivers.amqpdriver [req-05a23ce8-3dcc-405c-8a89-5bd22cf4b412 
73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: 73dc3b58d81f4cb48900b6182f3e2968 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:45:51.174 134146 DEBUG oslo_concurrency.lockutils [req-05a23ce8-3dcc-405c-8a89-5bd22cf4b412 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:45:51.174 134146 DEBUG oslo_concurrency.lockutils [req-05a23ce8-3dcc-405c-8a89-5bd22cf4b412 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:45:51.187 134146 DEBUG oslo_concurrency.lockutils [req-05a23ce8-3dcc-405c-8a89-5bd22cf4b412 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:45:51.188 134146 DEBUG nova.scheduler.host_manager [req-05a23ce8-3dcc-405c-8a89-5bd22cf4b412 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state from 
compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["fpu", "apic", "pku", "pge", "pse", "fsgsbase", "rtm", "mmx", "avx512bw", "cx16", "avx2", "cmov", "xsavec", "clwb", "smap", "clflushopt", "avx512-vpopcntdq", "pat", "lm", "pclmuldq", "msr", "sse4.2", "ssse3", "arat", "adx", "sse2", "fma", "aes", "rdtscp", "avx512vl", "xsaveopt", "3dnowprefetch", "vme", "hle", "sse4.1", "syscall", "vpclmulqdq", "cx8", "lahf_lm", "clflush", "avx512cd", "tsc", "avx512dq", "gfni", "pse36", "wbnoinvd", "sep", "nx", "avx512vbmi", "popcnt", "spec-ctrl", "vaes", "pae", "fxsr", "ssbd", "la57", "avx512f", "hypervisor", "mca", "abm", "avx512vbmi2", "mtrr", "movbe", "pni", "rdseed", "xgetbv1", "pcid", "invpcid", "erms", "ht", "avx512bitalg", "mce", "umip", "de", "rdrand", "pdpe1gb", "smep", "xsave", "bmi1", "bmi2", "avx", "avx512vnni", "x2apic", "tsc-deadline", "f16c", "sse"]}',created_at=2026-04-23T00:42:02Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=26,free_disk_gb=77,free_ram_mb=31387,host='cn-jenkins-deploy-platform-juju-os-697-1',host_ip=10.0.0.129,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-697-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=0,mapped=1,memory_mb=31899,memory_mb_used=512,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", 
"nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035207, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["id", "cpu_usage", "memory_usage", "pinned_cpus", "network_metadata", "siblings", "memory", "socket", "pcpuset", "mempages", "cpuset"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=0,service_id=None,stats={failed_builds='0'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-23T01:45:25Z,uuid=aa32474e-00f6-4c5b-a2ec-d12a3f31dd05,vcpus=8,vcpus_used=0) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-23 01:45:51.190 134146 DEBUG nova.scheduler.host_manager [req-05a23ce8-3dcc-405c-8a89-5bd22cf4b412 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 
2026-04-23 01:45:51.190 134146 DEBUG nova.scheduler.host_manager [req-05a23ce8-3dcc-405c-8a89-5bd22cf4b412 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with service dict: {'id': 9, 'uuid': '979fb67d-7796-4f44-89f1-b2234e3e1381', 'host': 'cn-jenkins-deploy-platform-juju-os-697-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 380, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 23, 1, 45, 42, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 23, 0, 42, 2, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 23, 1, 45, 42, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-23 01:45:51.190 134146 DEBUG nova.scheduler.host_manager [req-05a23ce8-3dcc-405c-8a89-5bd22cf4b412 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-23 01:45:51.191 134146 DEBUG oslo_concurrency.lockutils [req-05a23ce8-3dcc-405c-8a89-5bd22cf4b412 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.003s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:45:51.191 134146 INFO nova.scheduler.host_manager [req-05a23ce8-3dcc-405c-8a89-5bd22cf4b412 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 
463c1a7f457542babd061a9f36cbf244] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-697-1 2026-04-23 01:45:51.191 134146 DEBUG nova.scheduler.manager [req-05a23ce8-3dcc-405c-8a89-5bd22cf4b412 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-23 01:45:51.192 134146 DEBUG nova.scheduler.manager [req-05a23ce8-3dcc-405c-8a89-5bd22cf4b412 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-23 01:45:51.192 134146 DEBUG nova.scheduler.utils [req-05a23ce8-3dcc-405c-8a89-5bd22cf4b412 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Attempting to claim resources in the placement API for instance eec22cea-a194-4deb-ac95-b5f388fa7f5b claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-23 01:45:51.279 134146 DEBUG nova.scheduler.manager [req-05a23ce8-3dcc-405c-8a89-5bd22cf4b412 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] [instance: eec22cea-a194-4deb-ac95-b5f388fa7f5b] Selected host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0 _consume_selected_host 
/usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-23 01:45:51.280 134146 DEBUG oslo_concurrency.lockutils [req-05a23ce8-3dcc-405c-8a89-5bd22cf4b412 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:45:51.280 134146 DEBUG oslo_concurrency.lockutils [req-05a23ce8-3dcc-405c-8a89-5bd22cf4b412 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:45:51.282 134146 DEBUG oslo_messaging._drivers.amqpdriver [req-05a23ce8-3dcc-405c-8a89-5bd22cf4b412 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: 2aeea017f7ea45a99acef4714646d116 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:45:51.283 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:45:51.284 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:51.284 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:45:51.288 134146 DEBUG oslo_messaging._drivers.amqpdriver [req-05a23ce8-3dcc-405c-8a89-5bd22cf4b412 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] sending reply msg_id: 6223f89dba0c489994f005f2780a5f88 reply queue: reply_196060a39108420a873d953cb9a1134e time elapsed: 1.0111395689991696s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-23 01:45:51.585 134138 DEBUG nova.scheduler.request_filter [req-bdd9334d-3f39-4a17-a3a4-4b109b4d8aa9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'remote_managed_ports_filter' took 0.5 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:45:51.601 134138 DEBUG oslo_concurrency.lockutils [req-bdd9334d-3f39-4a17-a3a4-4b109b4d8aa9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:45:51.602 134138 DEBUG oslo_concurrency.lockutils [req-bdd9334d-3f39-4a17-a3a4-4b109b4d8aa9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:45:51.687 134138 DEBUG oslo_messaging._drivers.amqpdriver [req-bdd9334d-3f39-4a17-a3a4-4b109b4d8aa9 73f938ad9965433285982b7734c1a728 
017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: f699ecc88ac64cec8ebb5d248780154f NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:45:51.691 134138 DEBUG oslo_concurrency.lockutils [req-bdd9334d-3f39-4a17-a3a4-4b109b4d8aa9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:45:51.691 134138 DEBUG oslo_concurrency.lockutils [req-bdd9334d-3f39-4a17-a3a4-4b109b4d8aa9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:45:51.723 134138 DEBUG oslo_concurrency.lockutils [req-bdd9334d-3f39-4a17-a3a4-4b109b4d8aa9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:45:51.724 134138 DEBUG nova.scheduler.host_manager [req-bdd9334d-3f39-4a17-a3a4-4b109b4d8aa9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state from compute node: 
ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["fpu", "apic", "pku", "pge", "pse", "fsgsbase", "rtm", "mmx", "avx512bw", "cx16", "avx2", "cmov", "xsavec", "clwb", "smap", "clflushopt", "avx512-vpopcntdq", "pat", "lm", "pclmuldq", "msr", "sse4.2", "ssse3", "arat", "adx", "sse2", "fma", "aes", "rdtscp", "avx512vl", "xsaveopt", "3dnowprefetch", "vme", "hle", "sse4.1", "syscall", "vpclmulqdq", "cx8", "lahf_lm", "clflush", "avx512cd", "tsc", "avx512dq", "gfni", "pse36", "wbnoinvd", "sep", "nx", "avx512vbmi", "popcnt", "spec-ctrl", "vaes", "pae", "fxsr", "ssbd", "la57", "avx512f", "hypervisor", "mca", "abm", "avx512vbmi2", "mtrr", "movbe", "pni", "rdseed", "xgetbv1", "pcid", "invpcid", "erms", "ht", "avx512bitalg", "mce", "umip", "de", "rdrand", "pdpe1gb", "smep", "xsave", "bmi1", "bmi2", "avx", "avx512vnni", "x2apic", "tsc-deadline", "f16c", "sse"]}',created_at=2026-04-23T00:42:02Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=26,free_disk_gb=77,free_ram_mb=31387,host='cn-jenkins-deploy-platform-juju-os-697-1',host_ip=10.0.0.129,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-697-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=0,mapped=1,memory_mb=31899,memory_mb_used=512,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", 
"nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035207, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["id", "cpu_usage", "memory_usage", "pinned_cpus", "network_metadata", "siblings", "memory", "socket", "pcpuset", "mempages", "cpuset"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=0,service_id=None,stats={failed_builds='0'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-23T01:45:25Z,uuid=aa32474e-00f6-4c5b-a2ec-d12a3f31dd05,vcpus=8,vcpus_used=0) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-23 01:45:51.725 134138 DEBUG nova.scheduler.host_manager [req-bdd9334d-3f39-4a17-a3a4-4b109b4d8aa9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-23 01:45:51.725 134138 DEBUG 
nova.scheduler.host_manager [req-bdd9334d-3f39-4a17-a3a4-4b109b4d8aa9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with service dict: {'id': 9, 'uuid': '979fb67d-7796-4f44-89f1-b2234e3e1381', 'host': 'cn-jenkins-deploy-platform-juju-os-697-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 380, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 23, 1, 45, 42, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 23, 0, 42, 2, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 23, 1, 45, 42, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-23 01:45:51.726 134138 DEBUG nova.scheduler.host_manager [req-bdd9334d-3f39-4a17-a3a4-4b109b4d8aa9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-23 01:45:51.726 134138 DEBUG oslo_concurrency.lockutils [req-bdd9334d-3f39-4a17-a3a4-4b109b4d8aa9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.003s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:45:51.726 134138 INFO nova.scheduler.host_manager [req-bdd9334d-3f39-4a17-a3a4-4b109b4d8aa9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Host filter 
forcing available hosts to cn-jenkins-deploy-platform-juju-os-697-1 2026-04-23 01:45:51.727 134138 DEBUG nova.scheduler.manager [req-bdd9334d-3f39-4a17-a3a4-4b109b4d8aa9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-23 01:45:51.727 134138 DEBUG nova.scheduler.manager [req-bdd9334d-3f39-4a17-a3a4-4b109b4d8aa9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-23 01:45:51.728 134138 DEBUG nova.scheduler.utils [req-bdd9334d-3f39-4a17-a3a4-4b109b4d8aa9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Attempting to claim resources in the placement API for instance 8e1f18a2-bca6-4f46-8ea3-92122e133fe8 claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-23 01:45:51.892 134138 DEBUG nova.scheduler.manager [req-bdd9334d-3f39-4a17-a3a4-4b109b4d8aa9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] [instance: 8e1f18a2-bca6-4f46-8ea3-92122e133fe8] Selected host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-23 01:45:51.893 134138 
DEBUG oslo_concurrency.lockutils [req-bdd9334d-3f39-4a17-a3a4-4b109b4d8aa9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:45:51.894 134138 DEBUG oslo_concurrency.lockutils [req-bdd9334d-3f39-4a17-a3a4-4b109b4d8aa9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:45:51.895 134138 DEBUG oslo_messaging._drivers.amqpdriver [req-bdd9334d-3f39-4a17-a3a4-4b109b4d8aa9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: b9391f78ff7446acaf2676c392640d48 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:45:51.902 134138 DEBUG oslo_messaging._drivers.amqpdriver [req-bdd9334d-3f39-4a17-a3a4-4b109b4d8aa9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] sending reply msg_id: 1d542e6170e04f58a87b54da39889d3c reply queue: reply_979ed65011a945248f9b38f6dd19b658 time elapsed: 0.8180840899995019s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-23 01:45:52.090 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:45:52.091 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:52.091 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:45:53.287 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:45:53.287 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:53.287 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:45:54.093 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:45:54.094 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:54.094 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:45:56.929 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 3a44c90d51b94047bfb29813633be6e0 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:45:56.929 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 3a44c90d51b94047bfb29813633be6e0 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:45:56.929 
134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 3a44c90d51b94047bfb29813633be6e0 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:45:56.930 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:56.930 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:56.930 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:56.930 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 3a44c90d51b94047bfb29813633be6e0 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:45:56.930 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 3a44c90d51b94047bfb29813633be6e0 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:45:56.930 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 3a44c90d51b94047bfb29813633be6e0 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:45:56.930 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:56.930 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:56.930 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:56.930 134145 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:45:56.930 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:45:56.930 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:45:56.932 134145 DEBUG oslo_concurrency.lockutils [req-bdd9334d-3f39-4a17-a3a4-4b109b4d8aa9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:45:56.932 134145 DEBUG oslo_concurrency.lockutils [req-bdd9334d-3f39-4a17-a3a4-4b109b4d8aa9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:45:56.932 134146 DEBUG oslo_concurrency.lockutils [req-bdd9334d-3f39-4a17-a3a4-4b109b4d8aa9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:45:56.932 134138 DEBUG oslo_concurrency.lockutils [req-bdd9334d-3f39-4a17-a3a4-4b109b4d8aa9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 
463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:45:56.932 134146 DEBUG oslo_concurrency.lockutils [req-bdd9334d-3f39-4a17-a3a4-4b109b4d8aa9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:45:56.932 134138 DEBUG oslo_concurrency.lockutils [req-bdd9334d-3f39-4a17-a3a4-4b109b4d8aa9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:45:56.933 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:45:56.933 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:56.933 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:45:56.933 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:45:56.933 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:56.933 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:45:56.933 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:45:56.933 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:56.933 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:45:56.934 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 3a44c90d51b94047bfb29813633be6e0 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:45:56.934 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:56.934 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 3a44c90d51b94047bfb29813633be6e0 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:45:56.934 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:56.935 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:45:56.937 134140 DEBUG oslo_concurrency.lockutils [req-bdd9334d-3f39-4a17-a3a4-4b109b4d8aa9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 
463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:45:56.938 134140 DEBUG oslo_concurrency.lockutils [req-bdd9334d-3f39-4a17-a3a4-4b109b4d8aa9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:45:56.938 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:45:56.938 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:56.938 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:45:56.979 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 54977c638d8b4681bef6356e40ca9d7e __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:45:56.979 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 54977c638d8b4681bef6356e40ca9d7e __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:45:56.979 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 54977c638d8b4681bef6356e40ca9d7e __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:45:56.980 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:56.980 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:56.980 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 54977c638d8b4681bef6356e40ca9d7e poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:45:56.980 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 54977c638d8b4681bef6356e40ca9d7e poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:45:56.980 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:56.980 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:56.980 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:56.980 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:45:56.980 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 54977c638d8b4681bef6356e40ca9d7e poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:45:56.980 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:45:56.980 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:56.981 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:45:56.982 134140 DEBUG oslo_concurrency.lockutils [req-05a23ce8-3dcc-405c-8a89-5bd22cf4b412 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:45:56.982 134140 DEBUG oslo_concurrency.lockutils [req-05a23ce8-3dcc-405c-8a89-5bd22cf4b412 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:45:56.982 134146 DEBUG oslo_concurrency.lockutils [req-05a23ce8-3dcc-405c-8a89-5bd22cf4b412 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:45:56.982 134145 DEBUG oslo_concurrency.lockutils [req-05a23ce8-3dcc-405c-8a89-5bd22cf4b412 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:45:56.983 134146 DEBUG 
oslo_concurrency.lockutils [req-05a23ce8-3dcc-405c-8a89-5bd22cf4b412 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:45:56.983 134145 DEBUG oslo_concurrency.lockutils [req-05a23ce8-3dcc-405c-8a89-5bd22cf4b412 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:45:56.983 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:45:56.983 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:56.983 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:45:56.983 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:45:56.983 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:45:56.983 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:56.983 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is 
running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:56.983 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:45:56.984 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:45:56.985 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 54977c638d8b4681bef6356e40ca9d7e __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:45:56.986 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:56.986 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 54977c638d8b4681bef6356e40ca9d7e poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:45:56.987 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:56.987 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:45:56.991 134138 DEBUG oslo_concurrency.lockutils [req-05a23ce8-3dcc-405c-8a89-5bd22cf4b412 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:45:56.991 134138 DEBUG oslo_concurrency.lockutils [req-05a23ce8-3dcc-405c-8a89-5bd22cf4b412 
73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:45:56.991 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:45:56.993 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:56.993 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:45:57.984 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:45:57.984 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:45:57.984 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:57.984 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:57.984 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:45:57.984 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:45:57.985 134140 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:45:57.985 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:57.985 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:45:57.994 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:45:57.994 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:57.994 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:45:59.987 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:45:59.987 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:45:59.987 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:59.987 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:45:59.987 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:59.987 134145 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:45:59.988 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:45:59.988 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:59.988 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:45:59.995 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:45:59.995 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:45:59.996 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:03.052 134145 DEBUG oslo_service.periodic_task [req-9ce24089-0978-4ab7-bb1e-fb4513f717db - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:46:03.056 134145 DEBUG oslo_concurrency.lockutils [req-bb387a36-b70f-4a37-b241-d6ce412bfed1 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:46:03.057 134145 DEBUG oslo_concurrency.lockutils [req-bb387a36-b70f-4a37-b241-d6ce412bfed1 - - - - -] Lock 
"de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:46:03.989 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:03.989 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:03.989 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:03.989 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:03.990 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:03.989 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:03.990 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:03.990 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:03.990 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:03.997 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:03.997 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:03.997 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:11.994 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:11.994 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:11.994 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:11.996 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:11.996 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:11.996 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:11.996 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:11.996 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:11.996 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:12.003 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:12.004 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:12.004 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:15.082 134146 DEBUG oslo_service.periodic_task [req-9ec79110-bb75-48f8-804c-dbdce27ab8d2 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:46:15.086 134146 DEBUG oslo_concurrency.lockutils [req-0849b5ba-43b4-4027-bc96-274a345b2af3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:46:15.086 134146 DEBUG oslo_concurrency.lockutils [req-0849b5ba-43b4-4027-bc96-274a345b2af3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:46:19.043 134138 DEBUG oslo_service.periodic_task [req-609aca63-6387-4686-8e3d-310453ae9aba - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:46:19.047 134138 DEBUG oslo_concurrency.lockutils [req-91f21f3a-ff6e-422e-9f50-d0d8bde6f93a - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:46:19.047 134138 DEBUG oslo_concurrency.lockutils [req-91f21f3a-ff6e-422e-9f50-d0d8bde6f93a - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:46:19.155 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ed99195a67d049d383491740dd1c3aea __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:46:19.156 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ed99195a67d049d383491740dd1c3aea __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:46:19.156 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:19.156 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ed99195a67d049d383491740dd1c3aea poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:46:19.156 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ed99195a67d049d383491740dd1c3aea __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:46:19.156 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:19.156 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:19.156 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] 
Poll the incoming message with unique_id: ed99195a67d049d383491740dd1c3aea poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:46:19.156 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:19.156 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:19.156 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:19.157 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ed99195a67d049d383491740dd1c3aea poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:46:19.157 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:19.157 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:19.157 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:19.157 134145 DEBUG oslo_concurrency.lockutils [req-18d4b18d-e15a-49c8-ab91-80816102619a 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:46:19.157 134146 DEBUG oslo_concurrency.lockutils [req-18d4b18d-e15a-49c8-ab91-80816102619a 
73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:46:19.157 134145 DEBUG oslo_concurrency.lockutils [req-18d4b18d-e15a-49c8-ab91-80816102619a 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:46:19.158 134140 DEBUG oslo_concurrency.lockutils [req-18d4b18d-e15a-49c8-ab91-80816102619a 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:46:19.158 134146 DEBUG oslo_concurrency.lockutils [req-18d4b18d-e15a-49c8-ab91-80816102619a 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:46:19.158 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:19.158 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:19.158 134140 DEBUG 
oslo_concurrency.lockutils [req-18d4b18d-e15a-49c8-ab91-80816102619a 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:46:19.158 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:19.158 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:19.158 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:19.158 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:19.158 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:19.158 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:19.159 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:19.159 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ed99195a67d049d383491740dd1c3aea __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:46:19.159 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:19.160 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ed99195a67d049d383491740dd1c3aea poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:46:19.160 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:19.160 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:19.161 134138 DEBUG oslo_concurrency.lockutils [req-18d4b18d-e15a-49c8-ab91-80816102619a 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:46:19.162 134138 DEBUG oslo_concurrency.lockutils [req-18d4b18d-e15a-49c8-ab91-80816102619a 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:46:19.162 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:19.162 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:19.162 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:19.231 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c7b04c79cdab4b988ac0e8e2470b2803 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:46:19.231 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c7b04c79cdab4b988ac0e8e2470b2803 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:46:19.231 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:19.231 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:19.231 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c7b04c79cdab4b988ac0e8e2470b2803 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:46:19.231 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c7b04c79cdab4b988ac0e8e2470b2803 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:46:19.231 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:19.232 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:19.232 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:19.232 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:19.232 134145 DEBUG oslo_concurrency.lockutils [req-1547b3fc-074f-4ba5-be8b-38be7d1d25a9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:46:19.232 134146 DEBUG oslo_concurrency.lockutils [req-1547b3fc-074f-4ba5-be8b-38be7d1d25a9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:46:19.232 134145 DEBUG oslo_concurrency.lockutils [req-1547b3fc-074f-4ba5-be8b-38be7d1d25a9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:46:19.233 134146 DEBUG oslo_concurrency.lockutils [req-1547b3fc-074f-4ba5-be8b-38be7d1d25a9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:46:19.233 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c7b04c79cdab4b988ac0e8e2470b2803 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 
2026-04-23 01:46:19.234 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c7b04c79cdab4b988ac0e8e2470b2803 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:46:19.234 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:19.235 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:19.235 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:19.234 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:19.236 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:19.236 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:19.234 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:19.234 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:19.236 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c7b04c79cdab4b988ac0e8e2470b2803 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:46:19.237 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:19.237 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:19.238 134138 DEBUG oslo_concurrency.lockutils [req-1547b3fc-074f-4ba5-be8b-38be7d1d25a9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:46:19.236 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c7b04c79cdab4b988ac0e8e2470b2803 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:46:19.239 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:19.239 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:19.240 134140 DEBUG oslo_concurrency.lockutils [req-1547b3fc-074f-4ba5-be8b-38be7d1d25a9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:46:19.240 134140 DEBUG oslo_concurrency.lockutils [req-1547b3fc-074f-4ba5-be8b-38be7d1d25a9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by 
"nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:46:19.241 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:19.238 134138 DEBUG oslo_concurrency.lockutils [req-1547b3fc-074f-4ba5-be8b-38be7d1d25a9 73f938ad9965433285982b7734c1a728 017bb5cc016c43a6ae997e9d36f8335f - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:46:19.241 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:19.241 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:19.241 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:19.241 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:19.242 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:20.236 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:20.236 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:20.236 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:20.237 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:20.237 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:20.237 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:20.243 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:20.243 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:20.243 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:20.243 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:20.243 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:20.243 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:22.045 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: 
621b311f32424239b550e3ff9cd59520 reply to reply_b20bc86560af44159724b4fea607f8c2 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-23 01:46:22.046 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:22.046 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 4219bc4a125347959e5ec832f4bdbbda poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:46:22.046 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:22.046 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:22.048 134145 DEBUG nova.scheduler.manager [req-3dac3e6f-c43f-459d-858d-6bc2663bbfcd 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Starting to schedule for instances: ['b72d8253-a42f-4219-950f-35f929fc24cd'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-23 01:46:22.049 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:22.049 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:22.050 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:22.052 134140 DEBUG oslo_service.periodic_task [req-70c292fe-6ff8-498e-ba1a-919ac8b6b481 - - - - -] Running 
periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:46:22.054 134145 DEBUG nova.scheduler.request_filter [req-3dac3e6f-c43f-459d-858d-6bc2663bbfcd 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:46:22.054 134145 DEBUG nova.scheduler.request_filter [req-3dac3e6f-c43f-459d-858d-6bc2663bbfcd 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-23 01:46:22.055 134145 DEBUG nova.scheduler.request_filter [req-3dac3e6f-c43f-459d-858d-6bc2663bbfcd 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:46:22.055 134145 DEBUG nova.scheduler.request_filter [req-3dac3e6f-c43f-459d-858d-6bc2663bbfcd 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:46:22.055 134145 DEBUG nova.scheduler.request_filter [req-3dac3e6f-c43f-459d-858d-6bc2663bbfcd 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'remote_managed_ports_filter' took 0.0 seconds 
wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:46:22.056 134140 DEBUG oslo_concurrency.lockutils [req-b765e0f2-4f03-4570-83d2-ed1ec1cb1ff1 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:46:22.057 134140 DEBUG oslo_concurrency.lockutils [req-b765e0f2-4f03-4570-83d2-ed1ec1cb1ff1 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:46:22.059 134145 DEBUG oslo_concurrency.lockutils [req-3dac3e6f-c43f-459d-858d-6bc2663bbfcd 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:46:22.060 134145 DEBUG oslo_concurrency.lockutils [req-3dac3e6f-c43f-459d-858d-6bc2663bbfcd 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:46:22.123 134145 DEBUG oslo_messaging._drivers.amqpdriver [req-3dac3e6f-c43f-459d-858d-6bc2663bbfcd 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: bcda9a221373402a9fc7c224d82336f7 NOTIFY exchange 'nova' topic 
'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:46:22.125 134145 DEBUG oslo_concurrency.lockutils [req-3dac3e6f-c43f-459d-858d-6bc2663bbfcd 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:46:22.125 134145 DEBUG oslo_concurrency.lockutils [req-3dac3e6f-c43f-459d-858d-6bc2663bbfcd 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:46:22.136 134145 DEBUG oslo_concurrency.lockutils [req-3dac3e6f-c43f-459d-858d-6bc2663bbfcd 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:46:22.136 134145 DEBUG nova.scheduler.host_manager [req-3dac3e6f-c43f-459d-858d-6bc2663bbfcd 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["fpu", 
"apic", "pku", "pge", "pse", "fsgsbase", "rtm", "mmx", "avx512bw", "cx16", "avx2", "cmov", "xsavec", "clwb", "smap", "clflushopt", "avx512-vpopcntdq", "pat", "lm", "pclmuldq", "msr", "sse4.2", "ssse3", "arat", "adx", "sse2", "fma", "aes", "rdtscp", "avx512vl", "xsaveopt", "3dnowprefetch", "vme", "hle", "sse4.1", "syscall", "vpclmulqdq", "cx8", "lahf_lm", "clflush", "avx512cd", "tsc", "avx512dq", "gfni", "pse36", "wbnoinvd", "sep", "nx", "avx512vbmi", "popcnt", "spec-ctrl", "vaes", "pae", "fxsr", "ssbd", "la57", "avx512f", "hypervisor", "mca", "abm", "avx512vbmi2", "mtrr", "movbe", "pni", "rdseed", "xgetbv1", "pcid", "invpcid", "erms", "ht", "avx512bitalg", "mce", "umip", "de", "rdrand", "pdpe1gb", "smep", "xsave", "bmi1", "bmi2", "avx", "avx512vnni", "x2apic", "tsc-deadline", "f16c", "sse"]}',created_at=2026-04-23T00:42:02Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=26,free_disk_gb=77,free_ram_mb=31387,host='cn-jenkins-deploy-platform-juju-os-697-1',host_ip=10.0.0.129,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-697-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=0,mapped=1,memory_mb=31899,memory_mb_used=512,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035207, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": 
"NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["id", "cpu_usage", "memory_usage", "pinned_cpus", "network_metadata", "siblings", "memory", "socket", "pcpuset", "mempages", "cpuset"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=0,service_id=None,stats={failed_builds='0',io_workload='0',num_instances='0',num_os_type_None='0',num_proj_017bb5cc016c43a6ae997e9d36f8335f='0',num_task_None='0',num_vm_building='0'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-23T01:46:19Z,uuid=aa32474e-00f6-4c5b-a2ec-d12a3f31dd05,vcpus=8,vcpus_used=0) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-23 01:46:22.137 134145 DEBUG nova.scheduler.host_manager [req-3dac3e6f-c43f-459d-858d-6bc2663bbfcd 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-23 01:46:22.137 134145 DEBUG nova.scheduler.host_manager 
[req-3dac3e6f-c43f-459d-858d-6bc2663bbfcd 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with service dict: {'id': 9, 'uuid': '979fb67d-7796-4f44-89f1-b2234e3e1381', 'host': 'cn-jenkins-deploy-platform-juju-os-697-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 383, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 23, 1, 46, 12, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 23, 0, 42, 2, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 23, 1, 46, 12, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-23 01:46:22.138 134145 DEBUG nova.scheduler.host_manager [req-3dac3e6f-c43f-459d-858d-6bc2663bbfcd 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-23 01:46:22.138 134145 DEBUG oslo_concurrency.lockutils [req-3dac3e6f-c43f-459d-858d-6bc2663bbfcd 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:46:22.138 134145 INFO nova.scheduler.host_manager [req-3dac3e6f-c43f-459d-858d-6bc2663bbfcd 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Host filter forcing available hosts to 
cn-jenkins-deploy-platform-juju-os-697-1 2026-04-23 01:46:22.138 134145 DEBUG nova.scheduler.manager [req-3dac3e6f-c43f-459d-858d-6bc2663bbfcd 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-23 01:46:22.139 134145 DEBUG nova.scheduler.manager [req-3dac3e6f-c43f-459d-858d-6bc2663bbfcd 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-23 01:46:22.139 134145 DEBUG nova.scheduler.utils [req-3dac3e6f-c43f-459d-858d-6bc2663bbfcd 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Attempting to claim resources in the placement API for instance b72d8253-a42f-4219-950f-35f929fc24cd claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-23 01:46:22.239 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:22.239 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:22.240 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:22.244 134140 
DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:22.244 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:22.244 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:22.245 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:22.245 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:22.246 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:22.253 134145 DEBUG nova.scheduler.manager [req-3dac3e6f-c43f-459d-858d-6bc2663bbfcd 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] [instance: b72d8253-a42f-4219-950f-35f929fc24cd] Selected host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-23 01:46:22.253 134145 DEBUG oslo_concurrency.lockutils [req-3dac3e6f-c43f-459d-858d-6bc2663bbfcd 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" 
:: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:46:22.254 134145 DEBUG oslo_concurrency.lockutils [req-3dac3e6f-c43f-459d-858d-6bc2663bbfcd 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:46:22.256 134145 DEBUG oslo_messaging._drivers.amqpdriver [req-3dac3e6f-c43f-459d-858d-6bc2663bbfcd 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: c79b3b8de5114810919a10844289a8a3 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:46:22.258 134145 DEBUG oslo_messaging._drivers.amqpdriver [req-3dac3e6f-c43f-459d-858d-6bc2663bbfcd 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] sending reply msg_id: 621b311f32424239b550e3ff9cd59520 reply queue: reply_b20bc86560af44159724b4fea607f8c2 time elapsed: 0.212624501999926s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-23 01:46:23.051 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:23.051 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:23.051 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:23.209 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: f9958ac65f9248ebb0b7cfa059a8989f reply to reply_b20bc86560af44159724b4fea607f8c2 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-23 01:46:23.210 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:23.210 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c24504cd53cc4757ab96c0e6655e7b40 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:46:23.210 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:23.210 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:23.212 134140 DEBUG nova.scheduler.manager [req-59ae07b8-9b03-42f7-b2ab-9bbec663be76 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Starting to schedule for instances: ['1a92ceed-7d70-4668-90d5-8426fd0a5bfb'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-23 01:46:23.213 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:23.214 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:23.214 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:23.219 134140 DEBUG nova.scheduler.request_filter [req-59ae07b8-9b03-42f7-b2ab-9bbec663be76 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:46:23.220 134140 DEBUG nova.scheduler.request_filter [req-59ae07b8-9b03-42f7-b2ab-9bbec663be76 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-23 01:46:23.221 134140 DEBUG nova.scheduler.request_filter [req-59ae07b8-9b03-42f7-b2ab-9bbec663be76 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:46:23.222 134140 DEBUG nova.scheduler.request_filter [req-59ae07b8-9b03-42f7-b2ab-9bbec663be76 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:46:23.222 134140 DEBUG nova.scheduler.request_filter [req-59ae07b8-9b03-42f7-b2ab-9bbec663be76 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper 
/usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:46:23.229 134140 DEBUG oslo_concurrency.lockutils [req-59ae07b8-9b03-42f7-b2ab-9bbec663be76 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:46:23.230 134140 DEBUG oslo_concurrency.lockutils [req-59ae07b8-9b03-42f7-b2ab-9bbec663be76 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:46:23.288 134140 DEBUG oslo_messaging._drivers.amqpdriver [req-59ae07b8-9b03-42f7-b2ab-9bbec663be76 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: 5e4d74ff338b40a490f606b8a9e01613 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:46:23.290 134140 DEBUG oslo_concurrency.lockutils [req-59ae07b8-9b03-42f7-b2ab-9bbec663be76 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:46:23.290 134140 DEBUG oslo_concurrency.lockutils [req-59ae07b8-9b03-42f7-b2ab-9bbec663be76 
6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:46:23.303 134140 DEBUG oslo_concurrency.lockutils [req-59ae07b8-9b03-42f7-b2ab-9bbec663be76 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:46:23.303 134140 DEBUG nova.scheduler.host_manager [req-59ae07b8-9b03-42f7-b2ab-9bbec663be76 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["fpu", "apic", "pku", "pge", "pse", "fsgsbase", "rtm", "mmx", "avx512bw", "cx16", "avx2", "cmov", "xsavec", "clwb", "smap", "clflushopt", "avx512-vpopcntdq", "pat", "lm", "pclmuldq", "msr", "sse4.2", "ssse3", "arat", "adx", "sse2", "fma", "aes", "rdtscp", "avx512vl", "xsaveopt", "3dnowprefetch", "vme", "hle", "sse4.1", "syscall", "vpclmulqdq", "cx8", "lahf_lm", "clflush", "avx512cd", "tsc", "avx512dq", "gfni", "pse36", "wbnoinvd", "sep", "nx", "avx512vbmi", "popcnt", "spec-ctrl", "vaes", "pae", "fxsr", "ssbd", "la57", "avx512f", "hypervisor", "mca", "abm", "avx512vbmi2", "mtrr", "movbe", "pni", "rdseed", "xgetbv1", "pcid", "invpcid", "erms", "ht", "avx512bitalg", 
"mce", "umip", "de", "rdrand", "pdpe1gb", "smep", "xsave", "bmi1", "bmi2", "avx", "avx512vnni", "x2apic", "tsc-deadline", "f16c", "sse"]}',created_at=2026-04-23T00:42:02Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=26,free_disk_gb=76,free_ram_mb=30363,host='cn-jenkins-deploy-platform-juju-os-697-1',host_ip=10.0.0.129,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-697-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=1,mapped=1,memory_mb=31899,memory_mb_used=1536,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035207, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, 
"nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["id", "cpu_usage", "memory_usage", "pinned_cpus", "network_metadata", "siblings", "memory", "socket", "pcpuset", "mempages", "cpuset"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=1,service_id=None,stats={failed_builds='0',io_workload='1',num_instances='1',num_os_type_None='1',num_proj_017bb5cc016c43a6ae997e9d36f8335f='0',num_proj_a93fc792f60d4003a604d33d572b43bb='1',num_task_None='1',num_vm_building='1'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-23T01:46:23Z,uuid=aa32474e-00f6-4c5b-a2ec-d12a3f31dd05,vcpus=8,vcpus_used=1) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-23 01:46:23.304 134140 DEBUG nova.scheduler.host_manager [req-59ae07b8-9b03-42f7-b2ab-9bbec663be76 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-23 01:46:23.304 134140 DEBUG nova.scheduler.host_manager [req-59ae07b8-9b03-42f7-b2ab-9bbec663be76 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with service dict: {'id': 9, 'uuid': '979fb67d-7796-4f44-89f1-b2234e3e1381', 'host': 'cn-jenkins-deploy-platform-juju-os-697-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 384, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 23, 1, 46, 22, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 23, 0, 42, 2, tzinfo=datetime.timezone.utc), 
'updated_at': datetime.datetime(2026, 4, 23, 1, 46, 22, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-23 01:46:23.305 134140 DEBUG nova.scheduler.host_manager [req-59ae07b8-9b03-42f7-b2ab-9bbec663be76 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-23 01:46:23.305 134140 DEBUG oslo_concurrency.lockutils [req-59ae07b8-9b03-42f7-b2ab-9bbec663be76 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:46:23.305 134140 INFO nova.scheduler.host_manager [req-59ae07b8-9b03-42f7-b2ab-9bbec663be76 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-697-1 2026-04-23 01:46:23.305 134140 DEBUG nova.scheduler.manager [req-59ae07b8-9b03-42f7-b2ab-9bbec663be76 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 30363MB disk: 26624MB io_ops: 1 instances: 1]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-23 01:46:23.305 134140 DEBUG nova.scheduler.manager [req-59ae07b8-9b03-42f7-b2ab-9bbec663be76 6e72bfe3f671413bb062aaf6680165ce 
a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 30363MB disk: 26624MB io_ops: 1 instances: 1, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-23 01:46:23.306 134140 DEBUG nova.scheduler.utils [req-59ae07b8-9b03-42f7-b2ab-9bbec663be76 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Attempting to claim resources in the placement API for instance 1a92ceed-7d70-4668-90d5-8426fd0a5bfb claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-23 01:46:23.386 134140 DEBUG nova.scheduler.manager [req-59ae07b8-9b03-42f7-b2ab-9bbec663be76 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] [instance: 1a92ceed-7d70-4668-90d5-8426fd0a5bfb] Selected host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 30363MB disk: 26624MB io_ops: 1 instances: 1 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-23 01:46:23.387 134140 DEBUG oslo_concurrency.lockutils [req-59ae07b8-9b03-42f7-b2ab-9bbec663be76 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:46:23.387 134140 DEBUG oslo_concurrency.lockutils [req-59ae07b8-9b03-42f7-b2ab-9bbec663be76 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 
463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:46:23.388 134140 DEBUG oslo_messaging._drivers.amqpdriver [req-59ae07b8-9b03-42f7-b2ab-9bbec663be76 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: 272682e4e1e64c16913d30dc6aad46c5 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:46:23.390 134140 DEBUG oslo_messaging._drivers.amqpdriver [req-59ae07b8-9b03-42f7-b2ab-9bbec663be76 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] sending reply msg_id: f9958ac65f9248ebb0b7cfa059a8989f reply queue: reply_b20bc86560af44159724b4fea607f8c2 time elapsed: 0.18048404499950266s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-23 01:46:24.215 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:24.216 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:24.216 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:25.053 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:25.053 134145 
DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:25.054 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:26.217 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:26.217 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:26.217 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:26.241 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:26.241 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:26.242 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:26.247 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:26.248 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:26.248 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:26.383 134138 
DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 648cb6c417724d959f014c83c7847972 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:46:26.383 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 648cb6c417724d959f014c83c7847972 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:46:26.383 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 648cb6c417724d959f014c83c7847972 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:46:26.383 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 648cb6c417724d959f014c83c7847972 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:46:26.384 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:26.384 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:26.384 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:26.384 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 648cb6c417724d959f014c83c7847972 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:46:26.384 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 648cb6c417724d959f014c83c7847972 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:46:26.384 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 648cb6c417724d959f014c83c7847972 
poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:46:26.384 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:26.384 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:26.384 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 648cb6c417724d959f014c83c7847972 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:46:26.384 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:26.384 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:26.384 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:26.384 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:26.384 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:26.384 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:26.385 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:26.386 134138 DEBUG oslo_concurrency.lockutils 
[req-3dac3e6f-c43f-459d-858d-6bc2663bbfcd 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:46:26.386 134140 DEBUG oslo_concurrency.lockutils [req-3dac3e6f-c43f-459d-858d-6bc2663bbfcd 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:46:26.387 134140 DEBUG oslo_concurrency.lockutils [req-3dac3e6f-c43f-459d-858d-6bc2663bbfcd 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:46:26.387 134138 DEBUG oslo_concurrency.lockutils [req-3dac3e6f-c43f-459d-858d-6bc2663bbfcd 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:46:26.387 134145 DEBUG oslo_concurrency.lockutils [req-3dac3e6f-c43f-459d-858d-6bc2663bbfcd 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:46:26.387 134146 DEBUG oslo_concurrency.lockutils [req-3dac3e6f-c43f-459d-858d-6bc2663bbfcd 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:46:26.388 134145 DEBUG oslo_concurrency.lockutils [req-3dac3e6f-c43f-459d-858d-6bc2663bbfcd 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:46:26.387 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:26.388 134146 DEBUG oslo_concurrency.lockutils [req-3dac3e6f-c43f-459d-858d-6bc2663bbfcd 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:46:26.388 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:26.388 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:26.388 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:26.388 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:26.388 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:26.388 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:26.388 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:26.388 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:26.388 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:26.388 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:26.389 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:26.697 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: e3062d2d59084739907223f30ab68b66 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:46:26.697 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: e3062d2d59084739907223f30ab68b66 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 
01:46:26.697 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: e3062d2d59084739907223f30ab68b66 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:46:26.697 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: e3062d2d59084739907223f30ab68b66 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:46:26.698 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:26.698 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:26.698 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: e3062d2d59084739907223f30ab68b66 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:46:26.698 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: e3062d2d59084739907223f30ab68b66 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:46:26.698 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:26.698 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: e3062d2d59084739907223f30ab68b66 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:46:26.698 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:26.698 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 
2026-04-23 01:46:26.698 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:26.698 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:26.698 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:26.698 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:26.698 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: e3062d2d59084739907223f30ab68b66 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:46:26.698 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:26.698 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:26.698 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:26.700 134138 DEBUG oslo_concurrency.lockutils [req-59ae07b8-9b03-42f7-b2ab-9bbec663be76 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:46:26.700 134145 DEBUG oslo_concurrency.lockutils 
[req-59ae07b8-9b03-42f7-b2ab-9bbec663be76 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:46:26.700 134138 DEBUG oslo_concurrency.lockutils [req-59ae07b8-9b03-42f7-b2ab-9bbec663be76 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:46:26.700 134145 DEBUG oslo_concurrency.lockutils [req-59ae07b8-9b03-42f7-b2ab-9bbec663be76 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:46:26.700 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:26.700 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:26.701 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:26.701 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:26.701 134145 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:26.701 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:26.701 134146 DEBUG oslo_concurrency.lockutils [req-59ae07b8-9b03-42f7-b2ab-9bbec663be76 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:46:26.701 134146 DEBUG oslo_concurrency.lockutils [req-59ae07b8-9b03-42f7-b2ab-9bbec663be76 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:46:26.701 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:26.701 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:26.701 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:26.702 134140 DEBUG oslo_concurrency.lockutils [req-59ae07b8-9b03-42f7-b2ab-9bbec663be76 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" 
acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:46:26.702 134140 DEBUG oslo_concurrency.lockutils [req-59ae07b8-9b03-42f7-b2ab-9bbec663be76 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:46:26.702 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:26.703 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:26.703 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:27.702 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:27.702 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:27.703 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:27.703 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:27.703 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:27.704 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:27.704 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:27.704 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:27.704 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:27.705 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:27.705 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:27.705 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:28.455 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1f4d7b45b69f49a093b1db88239775a3 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:46:28.455 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1f4d7b45b69f49a093b1db88239775a3 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:46:28.455 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:28.455 
134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:28.455 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1f4d7b45b69f49a093b1db88239775a3 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:46:28.456 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1f4d7b45b69f49a093b1db88239775a3 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:46:28.456 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1f4d7b45b69f49a093b1db88239775a3 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:46:28.456 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:28.456 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:28.456 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:28.456 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:28.456 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1f4d7b45b69f49a093b1db88239775a3 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:46:28.456 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:28.456 134145 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:28.456 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:28.456 134140 DEBUG oslo_concurrency.lockutils [req-a1dd123d-9769-4982-a84c-305efd541dd6 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:46:28.456 134146 DEBUG oslo_concurrency.lockutils [req-a1dd123d-9769-4982-a84c-305efd541dd6 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:46:28.457 134140 DEBUG nova.scheduler.host_manager [req-a1dd123d-9769-4982-a84c-305efd541dd6 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:46:28.457 134146 DEBUG nova.scheduler.host_manager [req-a1dd123d-9769-4982-a84c-305efd541dd6 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:46:28.457 134140 DEBUG oslo_concurrency.lockutils [req-a1dd123d-9769-4982-a84c-305efd541dd6 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:46:28.457 134145 DEBUG oslo_concurrency.lockutils [req-a1dd123d-9769-4982-a84c-305efd541dd6 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:46:28.457 134146 DEBUG oslo_concurrency.lockutils [req-a1dd123d-9769-4982-a84c-305efd541dd6 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:46:28.457 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:28.457 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:28.457 134145 DEBUG nova.scheduler.host_manager [req-a1dd123d-9769-4982-a84c-305efd541dd6 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:46:28.457 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:28.457 134145 DEBUG oslo_concurrency.lockutils [req-a1dd123d-9769-4982-a84c-305efd541dd6 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:46:28.458 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:28.458 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:46:28.458 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1f4d7b45b69f49a093b1db88239775a3 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:46:28.458 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:28.458 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:46:28.458 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:28.458 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:46:28.459 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:46:28.459 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1f4d7b45b69f49a093b1db88239775a3 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:46:28.459 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:46:28.459 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:46:28.460 134138 DEBUG oslo_concurrency.lockutils [req-a1dd123d-9769-4982-a84c-305efd541dd6 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:46:28.460 134138 DEBUG nova.scheduler.host_manager [req-a1dd123d-9769-4982-a84c-305efd541dd6 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:46:28.460 134138 DEBUG oslo_concurrency.lockutils [req-a1dd123d-9769-4982-a84c-305efd541dd6 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:46:28.461 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:46:28.461 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:46:28.461 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:46:29.459 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:46:29.459 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:46:29.459 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:46:29.460 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:46:29.460 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:46:29.460 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:46:29.460 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:46:29.460 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:46:29.460 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:46:29.462 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:46:29.462 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:46:29.462 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:46:31.462 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:46:31.462 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:46:31.462 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:46:31.462 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:46:31.463 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:46:31.463 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:46:31.463 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:46:31.464 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:46:31.464 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:46:31.464 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:46:31.465 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:46:31.466 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:46:33.063 134145 DEBUG oslo_service.periodic_task [req-bb387a36-b70f-4a37-b241-d6ce412bfed1 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:46:33.067 134145 DEBUG oslo_concurrency.lockutils [req-07d9afd9-2d3c-490d-a6a0-fc3d8a00c2b9 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:46:33.068 134145 DEBUG oslo_concurrency.lockutils [req-07d9afd9-2d3c-490d-a6a0-fc3d8a00c2b9 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:46:35.465 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:46:35.465 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:46:35.465 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:46:35.466 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:46:35.468 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:46:35.468 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:46:35.467 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:46:35.467 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:46:35.468 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:46:35.468 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:46:35.468 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:46:35.469 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:46:43.473 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:46:43.473 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:46:43.473 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:46:43.476 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:46:43.476 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:46:43.476 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:46:43.477 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:46:43.477 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:46:43.477 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:46:43.477 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:46:43.477 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:46:43.477 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:46:45.093 134146 DEBUG oslo_service.periodic_task [req-0849b5ba-43b4-4027-bc96-274a345b2af3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:46:45.098 134146 DEBUG oslo_concurrency.lockutils [req-5bd7b15c-c1d0-4b46-bcdb-ab89d79c36b1 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:46:45.099 134146 DEBUG oslo_concurrency.lockutils [req-5bd7b15c-c1d0-4b46-bcdb-ab89d79c36b1 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:46:50.019 134138 DEBUG oslo_service.periodic_task [req-91f21f3a-ff6e-422e-9f50-d0d8bde6f93a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:46:50.024 134138 DEBUG oslo_concurrency.lockutils [req-9c0995a6-2539-442e-b368-0bed6e84e8df - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:46:50.024 134138 DEBUG oslo_concurrency.lockutils [req-9c0995a6-2539-442e-b368-0bed6e84e8df - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:46:53.053 134140 DEBUG oslo_service.periodic_task [req-b765e0f2-4f03-4570-83d2-ed1ec1cb1ff1 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:46:53.057 134140 DEBUG oslo_concurrency.lockutils [req-2b250b47-fea0-431d-8f01-fdc52f365266 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:46:53.057 134140 DEBUG oslo_concurrency.lockutils [req-2b250b47-fea0-431d-8f01-fdc52f365266 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:46:59.475 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:46:59.476 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:46:59.477 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:46:59.478 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:46:59.478 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:46:59.478 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:46:59.480 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:46:59.480 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:46:59.480 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:46:59.480 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:46:59.480 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:46:59.480 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:47:03.783 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 7a688fb1acea440c98812223c470a4d9 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:47:03.783 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 7a688fb1acea440c98812223c470a4d9 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:47:03.783 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:47:03.784 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 7a688fb1acea440c98812223c470a4d9 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:47:03.784 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:47:03.784 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:47:03.784 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:47:03.784 134138 DEBUG oslo_concurrency.lockutils [req-525e019a-fb6a-48b5-8341-3081fbd3684c 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:47:03.785 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 7a688fb1acea440c98812223c470a4d9 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:47:03.784 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 7a688fb1acea440c98812223c470a4d9 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:47:03.785 134138 DEBUG oslo_concurrency.lockutils [req-525e019a-fb6a-48b5-8341-3081fbd3684c 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:47:03.785 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:47:03.785 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 7a688fb1acea440c98812223c470a4d9 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:47:03.785 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:47:03.786 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:47:03.786 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 7a688fb1acea440c98812223c470a4d9 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:47:03.786 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:47:03.787 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:47:03.787 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:47:03.787 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:47:03.787 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 7a688fb1acea440c98812223c470a4d9 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:47:03.787 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:47:03.787 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:47:03.786 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:47:03.788 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:47:03.788 134140 DEBUG oslo_concurrency.lockutils [req-525e019a-fb6a-48b5-8341-3081fbd3684c 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:47:03.787 134146 DEBUG oslo_concurrency.lockutils [req-525e019a-fb6a-48b5-8341-3081fbd3684c 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:47:03.788 134140 DEBUG oslo_concurrency.lockutils [req-525e019a-fb6a-48b5-8341-3081fbd3684c 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:47:03.789 134146 DEBUG oslo_concurrency.lockutils [req-525e019a-fb6a-48b5-8341-3081fbd3684c 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:47:03.789 134145 DEBUG oslo_concurrency.lockutils [req-525e019a-fb6a-48b5-8341-3081fbd3684c 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:47:03.790 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:47:03.790 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:47:03.790 134145 DEBUG oslo_concurrency.lockutils [req-525e019a-fb6a-48b5-8341-3081fbd3684c 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:47:03.790 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:47:03.790 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:47:03.790 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:47:03.791 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:47:03.791 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:47:03.791 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:47:03.791 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:47:03.846 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 8d5e08905348445c97e5726422ce6548 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:47:03.846 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 8d5e08905348445c97e5726422ce6548 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:47:03.846 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 8d5e08905348445c97e5726422ce6548 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:47:03.846 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:47:03.846 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 8d5e08905348445c97e5726422ce6548 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:47:03.846 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:47:03.846 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 8d5e08905348445c97e5726422ce6548 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:47:03.846 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:47:03.846 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:47:03.846 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:47:03.847 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:47:03.847 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 8d5e08905348445c97e5726422ce6548 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:47:03.847 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:47:03.847 134138 DEBUG oslo_concurrency.lockutils [req-7c84da6e-2fa8-47d6-bfdf-62faef3c49e4 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:47:03.847 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 8d5e08905348445c97e5726422ce6548 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:47:03.847 134138 DEBUG oslo_concurrency.lockutils [req-7c84da6e-2fa8-47d6-bfdf-62faef3c49e4 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:47:03.847 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:47:03.848 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:47:03.847 134140 DEBUG oslo_concurrency.lockutils [req-7c84da6e-2fa8-47d6-bfdf-62faef3c49e4 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:47:03.846 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:47:03.848 134140 DEBUG oslo_concurrency.lockutils [req-7c84da6e-2fa8-47d6-bfdf-62faef3c49e4 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:47:03.848 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 8d5e08905348445c97e5726422ce6548 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:47:03.848 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:47:03.848 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:47:03.848 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:47:03.848 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:47:03.848 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:47:03.849 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:47:03.849 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:47:03.849 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:47:03.849 134146 DEBUG oslo_concurrency.lockutils [req-7c84da6e-2fa8-47d6-bfdf-62faef3c49e4 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:47:03.849 134146 DEBUG oslo_concurrency.lockutils [req-7c84da6e-2fa8-47d6-bfdf-62faef3c49e4 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:47:03.850 134145 DEBUG oslo_concurrency.lockutils [req-7c84da6e-2fa8-47d6-bfdf-62faef3c49e4 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:47:03.851 134145 DEBUG oslo_concurrency.lockutils [req-7c84da6e-2fa8-47d6-bfdf-62faef3c49e4 6e72bfe3f671413bb062aaf6680165ce a93fc792f60d4003a604d33d572b43bb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:47:03.850 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:47:03.851 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:47:03.851 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:47:03.852 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:47:03.852 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:47:03.852 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:47:04.051 134145 DEBUG oslo_service.periodic_task [req-07d9afd9-2d3c-490d-a6a0-fc3d8a00c2b9 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:47:04.055 134145 DEBUG oslo_concurrency.lockutils [req-2ca499c1-9211-4165-af82-e209ae1ea825 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:47:04.056 134145 DEBUG oslo_concurrency.lockutils [req-2ca499c1-9211-4165-af82-e209ae1ea825 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:47:04.850 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:47:04.851 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:47:04.851 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:47:04.850 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:47:04.852 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:47:04.852 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:47:04.852 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:47:04.852 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:47:04.852 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:47:04.853 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:47:04.853 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:47:04.853 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:47:06.854 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:47:06.854 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:47:06.854 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:47:06.854 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:47:06.854 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:47:06.854 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:47:06.855 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:47:06.855 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:47:06.855 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:47:06.855 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:06.856 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:06.856 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:10.858 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:10.858 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:10.858 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:10.858 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:10.858 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:10.858 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:10.858 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:10.858 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:10.858 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:10.858 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:10.859 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:10.859 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:13.609 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: bdf2bf2ae43649f3842f19b708836bf3 reply to reply_979ed65011a945248f9b38f6dd19b658 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-23 01:47:13.609 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:13.609 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c7e5af7ed89740eeb3a41fd6def47c31 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:47:13.609 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:13.609 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:13.611 134146 DEBUG nova.scheduler.manager [req-661fdac5-5524-4088-affe-dcba0985e127 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Starting to schedule for instances: ['4b6fd549-c9da-4486-8f1b-35425d85d713'] select_destinations 
/usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-23 01:47:13.613 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:13.613 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:13.613 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:13.616 134146 DEBUG nova.scheduler.request_filter [req-661fdac5-5524-4088-affe-dcba0985e127 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:47:13.617 134146 DEBUG nova.scheduler.request_filter [req-661fdac5-5524-4088-affe-dcba0985e127 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-23 01:47:13.617 134146 DEBUG nova.scheduler.request_filter [req-661fdac5-5524-4088-affe-dcba0985e127 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:47:13.617 134146 DEBUG nova.scheduler.request_filter [req-661fdac5-5524-4088-affe-dcba0985e127 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 
463c1a7f457542babd061a9f36cbf244] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:47:13.618 134146 DEBUG nova.scheduler.request_filter [req-661fdac5-5524-4088-affe-dcba0985e127 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:47:13.622 134146 DEBUG oslo_concurrency.lockutils [req-661fdac5-5524-4088-affe-dcba0985e127 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:47:13.623 134146 DEBUG oslo_concurrency.lockutils [req-661fdac5-5524-4088-affe-dcba0985e127 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:47:13.678 134146 DEBUG oslo_messaging._drivers.amqpdriver [req-661fdac5-5524-4088-affe-dcba0985e127 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: 81faff90918045349c369a3ad558cb19 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:47:13.680 134146 DEBUG oslo_concurrency.lockutils [req-661fdac5-5524-4088-affe-dcba0985e127
41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:47:13.680 134146 DEBUG oslo_concurrency.lockutils [req-661fdac5-5524-4088-affe-dcba0985e127 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:47:13.691 134146 DEBUG oslo_concurrency.lockutils [req-661fdac5-5524-4088-affe-dcba0985e127 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.update.<locals>._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:47:13.691 134146 DEBUG nova.scheduler.host_manager [req-661fdac5-5524-4088-affe-dcba0985e127 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["fpu", "apic", "pku", "pge", "pse", "fsgsbase", "rtm", "mmx", "avx512bw", "cx16", "avx2", "cmov", "xsavec", "clwb", "smap", "clflushopt", "avx512-vpopcntdq", "pat", "lm", "pclmuldq", "msr", "sse4.2", "ssse3",
"arat", "adx", "sse2", "fma", "aes", "rdtscp", "avx512vl", "xsaveopt", "3dnowprefetch", "vme", "hle", "sse4.1", "syscall", "vpclmulqdq", "cx8", "lahf_lm", "clflush", "avx512cd", "tsc", "avx512dq", "gfni", "pse36", "wbnoinvd", "sep", "nx", "avx512vbmi", "popcnt", "spec-ctrl", "vaes", "pae", "fxsr", "ssbd", "la57", "avx512f", "hypervisor", "mca", "abm", "avx512vbmi2", "mtrr", "movbe", "pni", "rdseed", "xgetbv1", "pcid", "invpcid", "erms", "ht", "avx512bitalg", "mce", "umip", "de", "rdrand", "pdpe1gb", "smep", "xsave", "bmi1", "bmi2", "avx", "avx512vnni", "x2apic", "tsc-deadline", "f16c", "sse"]}',created_at=2026-04-23T00:42:02Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=24,free_disk_gb=77,free_ram_mb=31387,host='cn-jenkins-deploy-platform-juju-os-697-1',host_ip=10.0.0.129,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-697-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=0,mapped=1,memory_mb=31899,memory_mb_used=512,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035207, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", 
"total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["id", "cpu_usage", "memory_usage", "pinned_cpus", "network_metadata", "siblings", "memory", "socket", "pcpuset", "mempages", "cpuset"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=0,service_id=None,stats={failed_builds='0',io_workload='0',num_instances='0',num_os_type_None='0',num_proj_a93fc792f60d4003a604d33d572b43bb='0',num_task_None='0',num_task_spawning='0',num_vm_active='0',num_vm_building='0'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-23T01:47:04Z,uuid=aa32474e-00f6-4c5b-a2ec-d12a3f31dd05,vcpus=8,vcpus_used=0) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-23 01:47:13.692 134146 DEBUG nova.scheduler.host_manager [req-661fdac5-5524-4088-affe-dcba0985e127 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-23 01:47:13.692 134146 DEBUG nova.scheduler.host_manager [req-661fdac5-5524-4088-affe-dcba0985e127 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with 
service dict: {'id': 9, 'uuid': '979fb67d-7796-4f44-89f1-b2234e3e1381', 'host': 'cn-jenkins-deploy-platform-juju-os-697-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 389, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 23, 1, 47, 12, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 23, 0, 42, 2, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 23, 1, 47, 12, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-23 01:47:13.693 134146 DEBUG nova.scheduler.host_manager [req-661fdac5-5524-4088-affe-dcba0985e127 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-23 01:47:13.693 134146 DEBUG oslo_concurrency.lockutils [req-661fdac5-5524-4088-affe-dcba0985e127 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.update.<locals>._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:47:13.693 134146 INFO nova.scheduler.host_manager [req-661fdac5-5524-4088-affe-dcba0985e127 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-697-1 2026-04-23 01:47:13.694 134146 DEBUG nova.scheduler.manager [req-661fdac5-5524-4088-affe-dcba0985e127 41af5728f5da4632b309c59a6dba57a5
73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 24576MB io_ops: 0 instances: 0]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-23 01:47:13.694 134146 DEBUG nova.scheduler.manager [req-661fdac5-5524-4088-affe-dcba0985e127 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 24576MB io_ops: 0 instances: 0, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-23 01:47:13.694 134146 DEBUG nova.scheduler.utils [req-661fdac5-5524-4088-affe-dcba0985e127 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Attempting to claim resources in the placement API for instance 4b6fd549-c9da-4486-8f1b-35425d85d713 claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-23 01:47:13.786 134146 DEBUG nova.scheduler.manager [req-661fdac5-5524-4088-affe-dcba0985e127 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] [instance: 4b6fd549-c9da-4486-8f1b-35425d85d713] Selected host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 24576MB io_ops: 0 instances: 0 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-23 01:47:13.786 134146 DEBUG oslo_concurrency.lockutils [req-661fdac5-5524-4088-affe-dcba0985e127 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 
463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.<locals>._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:47:13.787 134146 DEBUG oslo_concurrency.lockutils [req-661fdac5-5524-4088-affe-dcba0985e127 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.<locals>._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:47:13.788 134146 DEBUG oslo_messaging._drivers.amqpdriver [req-661fdac5-5524-4088-affe-dcba0985e127 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: b324c07b243f454987b6066de2784e10 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:47:13.790 134146 DEBUG oslo_messaging._drivers.amqpdriver [req-661fdac5-5524-4088-affe-dcba0985e127 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] sending reply msg_id: bdf2bf2ae43649f3842f19b708836bf3 reply queue: reply_979ed65011a945248f9b38f6dd19b658 time elapsed: 0.18065623799975583s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-23 01:47:14.614 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:14.645 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:14.645 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:15.290 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: c545609993bb49a98cf0a1ff72ac1616 reply to reply_196060a39108420a873d953cb9a1134e __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-23 01:47:15.291 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:15.291 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 6f4317c70d604bbea4d2345f2af99698 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:47:15.291 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:15.291 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:15.293 134138 DEBUG nova.scheduler.manager [req-6fea5260-613c-4a9b-91e0-9f30d47eae67 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Starting to schedule for instances: ['a0edd942-d39d-481d-8b09-04a64d1ef199'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-23 01:47:15.319 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:15.358 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:15.358 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:15.362 134138 DEBUG nova.scheduler.request_filter [req-6fea5260-613c-4a9b-91e0-9f30d47eae67 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:47:15.362 134138 DEBUG nova.scheduler.request_filter [req-6fea5260-613c-4a9b-91e0-9f30d47eae67 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-23 01:47:15.363 134138 DEBUG nova.scheduler.request_filter [req-6fea5260-613c-4a9b-91e0-9f30d47eae67 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:47:15.363 134138 DEBUG nova.scheduler.request_filter [req-6fea5260-613c-4a9b-91e0-9f30d47eae67 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:47:15.363 134138 DEBUG nova.scheduler.request_filter [req-6fea5260-613c-4a9b-91e0-9f30d47eae67 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 
463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:47:15.368 134138 DEBUG oslo_concurrency.lockutils [req-6fea5260-613c-4a9b-91e0-9f30d47eae67 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:47:15.369 134138 DEBUG oslo_concurrency.lockutils [req-6fea5260-613c-4a9b-91e0-9f30d47eae67 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:47:15.490 134138 DEBUG oslo_messaging._drivers.amqpdriver [req-6fea5260-613c-4a9b-91e0-9f30d47eae67 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: b3ae2a673c624b15ab527c2b5005b60b NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:47:15.492 134138 DEBUG oslo_concurrency.lockutils [req-6fea5260-613c-4a9b-91e0-9f30d47eae67 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:47:15.492 134138 DEBUG oslo_concurrency.lockutils [req-6fea5260-613c-4a9b-91e0-9f30d47eae67 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:47:15.502 134138 DEBUG oslo_concurrency.lockutils [req-6fea5260-613c-4a9b-91e0-9f30d47eae67 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.update.<locals>._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:47:15.502 134138 DEBUG nova.scheduler.host_manager [req-6fea5260-613c-4a9b-91e0-9f30d47eae67 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["fpu", "apic", "pku", "pge", "pse", "fsgsbase", "rtm", "mmx", "avx512bw", "cx16", "avx2", "cmov", "xsavec", "clwb", "smap", "clflushopt", "avx512-vpopcntdq", "pat", "lm", "pclmuldq", "msr", "sse4.2", "ssse3", "arat", "adx", "sse2", "fma", "aes", "rdtscp", "avx512vl", "xsaveopt", "3dnowprefetch", "vme", "hle", "sse4.1", "syscall", "vpclmulqdq", "cx8", "lahf_lm", "clflush", "avx512cd", "tsc", "avx512dq", "gfni", "pse36", "wbnoinvd", "sep", "nx", "avx512vbmi", "popcnt", "spec-ctrl", "vaes", "pae",
"fxsr", "ssbd", "la57", "avx512f", "hypervisor", "mca", "abm", "avx512vbmi2", "mtrr", "movbe", "pni", "rdseed", "xgetbv1", "pcid", "invpcid", "erms", "ht", "avx512bitalg", "mce", "umip", "de", "rdrand", "pdpe1gb", "smep", "xsave", "bmi1", "bmi2", "avx", "avx512vnni", "x2apic", "tsc-deadline", "f16c", "sse"]}',created_at=2026-04-23T00:42:02Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=24,free_disk_gb=76,free_ram_mb=30363,host='cn-jenkins-deploy-platform-juju-os-697-1',host_ip=10.0.0.129,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-697-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=1,mapped=1,memory_mb=31899,memory_mb_used=1536,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035207, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}], 
"network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["id", "cpu_usage", "memory_usage", "pinned_cpus", "network_metadata", "siblings", "memory", "socket", "pcpuset", "mempages", "cpuset"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=1,service_id=None,stats={failed_builds='0',io_workload='1',num_instances='1',num_os_type_None='1',num_proj_73d51924cf2a478b945fb4df11ac484e='1',num_proj_a93fc792f60d4003a604d33d572b43bb='0',num_task_None='1',num_task_spawning='0',num_vm_active='0',num_vm_building='1'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-23T01:47:14Z,uuid=aa32474e-00f6-4c5b-a2ec-d12a3f31dd05,vcpus=8,vcpus_used=1) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-23 01:47:15.503 134138 DEBUG nova.scheduler.host_manager [req-6fea5260-613c-4a9b-91e0-9f30d47eae67 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-23 01:47:15.503 134138 DEBUG nova.scheduler.host_manager [req-6fea5260-613c-4a9b-91e0-9f30d47eae67 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with service dict: {'id': 9, 'uuid': '979fb67d-7796-4f44-89f1-b2234e3e1381', 'host': 'cn-jenkins-deploy-platform-juju-os-697-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 389, 'disabled': False, 'disabled_reason': 
None, 'last_seen_up': datetime.datetime(2026, 4, 23, 1, 47, 12, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 23, 0, 42, 2, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 23, 1, 47, 12, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-23 01:47:15.503 134138 DEBUG nova.scheduler.host_manager [req-6fea5260-613c-4a9b-91e0-9f30d47eae67 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-23 01:47:15.504 134138 DEBUG oslo_concurrency.lockutils [req-6fea5260-613c-4a9b-91e0-9f30d47eae67 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:47:15.504 134138 INFO nova.scheduler.host_manager [req-6fea5260-613c-4a9b-91e0-9f30d47eae67 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-697-1 2026-04-23 01:47:15.504 134138 DEBUG nova.scheduler.manager [req-6fea5260-613c-4a9b-91e0-9f30d47eae67 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 30363MB disk: 24576MB io_ops: 1 instances: 
1]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-23 01:47:15.504 134138 DEBUG nova.scheduler.manager [req-6fea5260-613c-4a9b-91e0-9f30d47eae67 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 30363MB disk: 24576MB io_ops: 1 instances: 1, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-23 01:47:15.505 134138 DEBUG nova.scheduler.utils [req-6fea5260-613c-4a9b-91e0-9f30d47eae67 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Attempting to claim resources in the placement API for instance a0edd942-d39d-481d-8b09-04a64d1ef199 claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-23 01:47:15.781 134138 DEBUG nova.scheduler.manager [req-6fea5260-613c-4a9b-91e0-9f30d47eae67 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] [instance: a0edd942-d39d-481d-8b09-04a64d1ef199] Selected host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 30363MB disk: 24576MB io_ops: 1 instances: 1 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-23 01:47:15.781 134138 DEBUG oslo_concurrency.lockutils [req-6fea5260-613c-4a9b-91e0-9f30d47eae67 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:47:15.781 134138 DEBUG oslo_concurrency.lockutils [req-6fea5260-613c-4a9b-91e0-9f30d47eae67 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:47:15.783 134138 DEBUG oslo_messaging._drivers.amqpdriver [req-6fea5260-613c-4a9b-91e0-9f30d47eae67 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: a6febe0d7d1244b7aa734584ed1ea9dc NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:47:15.784 134138 DEBUG oslo_messaging._drivers.amqpdriver [req-6fea5260-613c-4a9b-91e0-9f30d47eae67 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] sending reply msg_id: c545609993bb49a98cf0a1ff72ac1616 reply queue: reply_196060a39108420a873d953cb9a1134e time elapsed: 0.49370863400054077s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-23 01:47:16.068 134146 DEBUG oslo_service.periodic_task [req-5bd7b15c-c1d0-4b46-bcdb-ab89d79c36b1 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:47:16.073 134146 DEBUG oslo_concurrency.lockutils [req-c4e8b2ef-5996-462e-8706-3994f298de83 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 
0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:47:16.074 134146 DEBUG oslo_concurrency.lockutils [req-c4e8b2ef-5996-462e-8706-3994f298de83 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:47:16.359 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:16.359 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:16.359 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:16.647 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:16.647 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:16.647 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:18.125 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 94cec9ecbee7409399cfaa2613f1620e __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:47:18.125 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 94cec9ecbee7409399cfaa2613f1620e __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:47:18.125 134145 DEBUG 
oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 94cec9ecbee7409399cfaa2613f1620e __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:47:18.125 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:18.125 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:18.125 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 94cec9ecbee7409399cfaa2613f1620e poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:47:18.125 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:18.125 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 94cec9ecbee7409399cfaa2613f1620e poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:47:18.125 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 94cec9ecbee7409399cfaa2613f1620e poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:47:18.126 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:18.126 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:18.126 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:18.126 134145 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:18.126 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:18.126 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:18.127 134140 DEBUG oslo_concurrency.lockutils [req-661fdac5-5524-4088-affe-dcba0985e127 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:47:18.128 134146 DEBUG oslo_concurrency.lockutils [req-661fdac5-5524-4088-affe-dcba0985e127 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:47:18.128 134146 DEBUG oslo_concurrency.lockutils [req-661fdac5-5524-4088-affe-dcba0985e127 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:47:18.128 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:18.128 
134145 DEBUG oslo_concurrency.lockutils [req-661fdac5-5524-4088-affe-dcba0985e127 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:47:18.128 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:18.128 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:18.128 134145 DEBUG oslo_concurrency.lockutils [req-661fdac5-5524-4088-affe-dcba0985e127 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:47:18.128 134140 DEBUG oslo_concurrency.lockutils [req-661fdac5-5524-4088-affe-dcba0985e127 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:47:18.129 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:18.129 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:18.129 
134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:18.129 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:18.129 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:18.129 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:18.130 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 94cec9ecbee7409399cfaa2613f1620e __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:47:18.131 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:18.131 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 94cec9ecbee7409399cfaa2613f1620e poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:47:18.131 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:18.131 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:18.134 134138 DEBUG oslo_concurrency.lockutils [req-661fdac5-5524-4088-affe-dcba0985e127 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by 
"nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:47:18.135 134138 DEBUG oslo_concurrency.lockutils [req-661fdac5-5524-4088-affe-dcba0985e127 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:47:18.135 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:18.135 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:18.135 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:19.129 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:19.130 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:19.130 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:19.130 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:19.130 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:19.130 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:19.131 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:19.131 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:19.131 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:19.137 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:19.138 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:19.139 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:19.428 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: fee0800a7a1a440fa11e3b8bcf57fc51 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:47:19.428 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: fee0800a7a1a440fa11e3b8bcf57fc51 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:47:19.428 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:19.428 134140 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:19.428 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: fee0800a7a1a440fa11e3b8bcf57fc51 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:47:19.429 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: fee0800a7a1a440fa11e3b8bcf57fc51 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:47:19.429 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:19.429 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:19.429 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:19.429 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:19.431 134140 DEBUG oslo_concurrency.lockutils [req-6fea5260-613c-4a9b-91e0-9f30d47eae67 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:47:19.431 134140 DEBUG oslo_concurrency.lockutils [req-6fea5260-613c-4a9b-91e0-9f30d47eae67 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock 
"host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:47:19.432 134145 DEBUG oslo_concurrency.lockutils [req-6fea5260-613c-4a9b-91e0-9f30d47eae67 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:47:19.432 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:19.432 134145 DEBUG oslo_concurrency.lockutils [req-6fea5260-613c-4a9b-91e0-9f30d47eae67 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:47:19.432 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:19.432 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:19.432 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:19.433 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:19.433 134145 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:19.433 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: fee0800a7a1a440fa11e3b8bcf57fc51 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:47:19.434 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: fee0800a7a1a440fa11e3b8bcf57fc51 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:47:19.434 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:19.434 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:19.434 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: fee0800a7a1a440fa11e3b8bcf57fc51 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:47:19.434 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: fee0800a7a1a440fa11e3b8bcf57fc51 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:47:19.434 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:19.434 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:19.434 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:19.434 134138 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:19.436 134138 DEBUG oslo_concurrency.lockutils [req-6fea5260-613c-4a9b-91e0-9f30d47eae67 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:47:19.436 134146 DEBUG oslo_concurrency.lockutils [req-6fea5260-613c-4a9b-91e0-9f30d47eae67 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:47:19.437 134138 DEBUG oslo_concurrency.lockutils [req-6fea5260-613c-4a9b-91e0-9f30d47eae67 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:47:19.438 134146 DEBUG oslo_concurrency.lockutils [req-6fea5260-613c-4a9b-91e0-9f30d47eae67 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:47:19.438 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:19.438 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:19.438 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:19.438 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:19.439 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:19.439 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:20.029 134138 DEBUG oslo_service.periodic_task [req-9c0995a6-2539-442e-b368-0bed6e84e8df - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:47:20.033 134138 DEBUG oslo_concurrency.lockutils [req-9d81e57b-bf9f-4ba9-96a4-4f0f8cd47209 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:47:20.034 134138 DEBUG oslo_concurrency.lockutils [req-9d81e57b-bf9f-4ba9-96a4-4f0f8cd47209 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:47:20.434 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] 
AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:20.434 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:20.434 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:20.434 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:20.435 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:20.435 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:20.439 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:20.439 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:20.439 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:20.440 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:20.441 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:20.441 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] 
AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:22.436 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:22.436 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:22.436 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:22.436 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:22.436 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:22.436 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:22.441 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:22.441 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:22.441 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:22.443 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:22.443 134138 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:22.444 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:23.062 134140 DEBUG oslo_service.periodic_task [req-2b250b47-fea0-431d-8f01-fdc52f365266 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:47:23.067 134140 DEBUG oslo_concurrency.lockutils [req-75bb617e-0023-4adc-a4f1-4b78dbda5495 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:47:23.067 134140 DEBUG oslo_concurrency.lockutils [req-75bb617e-0023-4adc-a4f1-4b78dbda5495 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:47:26.437 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:26.437 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:26.438 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:26.437 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 
2026-04-23 01:47:26.438 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:26.438 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:26.443 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:26.444 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:26.444 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:26.445 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:26.446 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:26.446 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:34.439 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:34.439 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:34.439 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 
2026-04-23 01:47:34.441 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:34.442 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:34.442 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:34.447 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:34.448 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:34.448 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:34.451 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:34.452 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:34.452 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:35.051 134145 DEBUG oslo_service.periodic_task [req-2ca499c1-9211-4165-af82-e209ae1ea825 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:47:35.056 134145 DEBUG oslo_concurrency.lockutils 
[req-a49302bf-7d4b-4a73-b33a-ef5715a6ec42 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:47:35.056 134145 DEBUG oslo_concurrency.lockutils [req-a49302bf-7d4b-4a73-b33a-ef5715a6ec42 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:47:36.614 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ff23b1771c3b4ddea4a839e74716f2f4 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:47:36.614 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ff23b1771c3b4ddea4a839e74716f2f4 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:47:36.614 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ff23b1771c3b4ddea4a839e74716f2f4 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:47:36.614 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:36.614 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:36.614 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:36.614 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ff23b1771c3b4ddea4a839e74716f2f4 poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:47:36.614 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ff23b1771c3b4ddea4a839e74716f2f4 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:47:36.614 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ff23b1771c3b4ddea4a839e74716f2f4 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:47:36.615 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:36.615 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:36.615 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:36.615 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:36.615 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:36.615 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:36.615 134140 DEBUG oslo_concurrency.lockutils [req-44a50b0a-0c97-4835-99f8-45ff0e7f0caf 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:47:36.615 134138 DEBUG oslo_concurrency.lockutils [req-44a50b0a-0c97-4835-99f8-45ff0e7f0caf 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:47:36.616 134146 DEBUG oslo_concurrency.lockutils [req-44a50b0a-0c97-4835-99f8-45ff0e7f0caf 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:47:36.616 134140 DEBUG oslo_concurrency.lockutils [req-44a50b0a-0c97-4835-99f8-45ff0e7f0caf 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:47:36.616 134146 DEBUG oslo_concurrency.lockutils [req-44a50b0a-0c97-4835-99f8-45ff0e7f0caf 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:47:36.616 134138 DEBUG oslo_concurrency.lockutils [req-44a50b0a-0c97-4835-99f8-45ff0e7f0caf 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 
463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:47:36.617 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ff23b1771c3b4ddea4a839e74716f2f4 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:47:36.617 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:36.617 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:36.617 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:36.617 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:36.617 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:36.617 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:36.617 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:36.617 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:36.618 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:36.618 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:36.617 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ff23b1771c3b4ddea4a839e74716f2f4 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:47:36.618 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:36.618 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:36.619 134145 DEBUG oslo_concurrency.lockutils [req-44a50b0a-0c97-4835-99f8-45ff0e7f0caf 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:47:36.619 134145 DEBUG oslo_concurrency.lockutils [req-44a50b0a-0c97-4835-99f8-45ff0e7f0caf 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:47:36.620 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:36.621 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:36.621 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:36.663 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 85d762af369c42628d03c61e407ed1c7 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:47:36.663 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 85d762af369c42628d03c61e407ed1c7 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:47:36.663 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 85d762af369c42628d03c61e407ed1c7 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:47:36.663 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:36.663 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:36.663 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:36.664 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 85d762af369c42628d03c61e407ed1c7 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:47:36.664 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 85d762af369c42628d03c61e407ed1c7 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:47:36.664 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the 
incoming message with unique_id: 85d762af369c42628d03c61e407ed1c7 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:47:36.664 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:36.664 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:36.664 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:36.664 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:36.664 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:36.664 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:36.664 134145 DEBUG oslo_concurrency.lockutils [req-9a600489-c15e-4f0e-baba-c038abf089bb 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:47:36.664 134140 DEBUG oslo_concurrency.lockutils [req-9a600489-c15e-4f0e-baba-c038abf089bb 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 
0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:47:36.664 134138 DEBUG oslo_concurrency.lockutils [req-9a600489-c15e-4f0e-baba-c038abf089bb 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:47:36.665 134145 DEBUG oslo_concurrency.lockutils [req-9a600489-c15e-4f0e-baba-c038abf089bb 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:47:36.665 134140 DEBUG oslo_concurrency.lockutils [req-9a600489-c15e-4f0e-baba-c038abf089bb 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:47:36.665 134138 DEBUG oslo_concurrency.lockutils [req-9a600489-c15e-4f0e-baba-c038abf089bb 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:47:36.665 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 85d762af369c42628d03c61e407ed1c7 __call__ 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:47:36.666 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:36.666 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:36.667 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:36.667 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:36.667 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:36.667 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:36.666 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:36.666 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:36.669 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 85d762af369c42628d03c61e407ed1c7 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:47:36.669 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:36.669 134140 DEBUG oslo_messaging._drivers.amqpdriver 
[-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:36.669 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:36.669 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:36.670 134146 DEBUG oslo_concurrency.lockutils [req-9a600489-c15e-4f0e-baba-c038abf089bb 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:47:36.670 134146 DEBUG oslo_concurrency.lockutils [req-9a600489-c15e-4f0e-baba-c038abf089bb 41af5728f5da4632b309c59a6dba57a5 73d51924cf2a478b945fb4df11ac484e - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:47:36.672 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:36.672 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:36.672 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:37.667 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:37.668 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:37.668 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:37.668 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:37.669 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:37.669 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:37.670 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:37.671 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:37.671 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:37.672 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:37.673 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:37.673 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:39.670 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:39.670 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:39.670 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:39.670 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:39.670 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:39.670 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:39.673 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:39.674 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:39.674 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:39.675 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:39.675 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:39.675 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:43.674 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:43.674 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:43.674 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:43.674 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:43.674 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:43.675 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:43.678 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:43.678 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:43.678 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:43.679 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection 
timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:43.680 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:43.680 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:46.078 134146 DEBUG oslo_service.periodic_task [req-c4e8b2ef-5996-462e-8706-3994f298de83 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:47:46.083 134146 DEBUG oslo_concurrency.lockutils [req-49657a0a-89a9-497a-b5c0-4d30e4d5c58d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:47:46.083 134146 DEBUG oslo_concurrency.lockutils [req-49657a0a-89a9-497a-b5c0-4d30e4d5c58d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:47:50.042 134138 DEBUG oslo_service.periodic_task [req-9d81e57b-bf9f-4ba9-96a4-4f0f8cd47209 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:47:50.049 134138 DEBUG oslo_concurrency.lockutils [req-bebd4749-82f5-4615-9afe-19390738057d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 
01:47:50.049 134138 DEBUG oslo_concurrency.lockutils [req-bebd4749-82f5-4615-9afe-19390738057d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:47:51.677 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:51.677 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:51.678 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:51.678 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:51.678 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:51.678 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:51.681 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:51.681 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:51.681 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:51.681 134140 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:47:51.682 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:47:51.682 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:47:54.052 134140 DEBUG oslo_service.periodic_task [req-75bb617e-0023-4adc-a4f1-4b78dbda5495 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:47:54.057 134140 DEBUG oslo_concurrency.lockutils [req-ac64b215-7d74-4f3a-b5a6-b944dd0f4d86 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:47:54.057 134140 DEBUG oslo_concurrency.lockutils [req-ac64b215-7d74-4f3a-b5a6-b944dd0f4d86 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:06.052 134145 DEBUG oslo_service.periodic_task [req-a49302bf-7d4b-4a73-b33a-ef5715a6ec42 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:48:06.056 134145 DEBUG oslo_concurrency.lockutils [req-c1214e8f-a9fb-44bc-8be0-924be597284e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:06.056 134145 DEBUG oslo_concurrency.lockutils [req-c1214e8f-a9fb-44bc-8be0-924be597284e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:07.679 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:07.679 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:07.679 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:07.679 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:07.679 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:07.679 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:07.684 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:07.684 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:07.684 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:07.684 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:07.684 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:07.684 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:14.973 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: 21ea9c5ac36e47668ce5834e4d1221d6 reply to reply_b20bc86560af44159724b4fea607f8c2 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-23 01:48:14.974 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:14.974 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 3fb19a4cd71344a694aa5027cd36245a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:48:14.974 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:14.974 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:14.976 134145 DEBUG nova.scheduler.manager [req-96586c70-a0ac-45be-ae9e-09c21c9c8a93 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Starting to schedule for instances: ['8c6385ee-86bc-4746-95ac-920d5d4a19b2'] select_destinations 
/usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-23 01:48:14.978 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:14.978 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:14.979 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:14.983 134145 DEBUG nova.scheduler.request_filter [req-96586c70-a0ac-45be-ae9e-09c21c9c8a93 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:48:14.983 134145 DEBUG nova.scheduler.request_filter [req-96586c70-a0ac-45be-ae9e-09c21c9c8a93 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-23 01:48:14.983 134145 DEBUG nova.scheduler.request_filter [req-96586c70-a0ac-45be-ae9e-09c21c9c8a93 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:48:14.983 134145 DEBUG nova.scheduler.request_filter [req-96586c70-a0ac-45be-ae9e-09c21c9c8a93 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 
463c1a7f457542babd061a9f36cbf244] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:48:14.983 134145 DEBUG nova.scheduler.request_filter [req-96586c70-a0ac-45be-ae9e-09c21c9c8a93 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:48:14.987 134145 DEBUG oslo_concurrency.lockutils [req-96586c70-a0ac-45be-ae9e-09c21c9c8a93 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:14.988 134145 DEBUG oslo_concurrency.lockutils [req-96586c70-a0ac-45be-ae9e-09c21c9c8a93 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:15.041 134145 DEBUG oslo_messaging._drivers.amqpdriver [req-96586c70-a0ac-45be-ae9e-09c21c9c8a93 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: e382372c1c7f4012ad5ecabc59249857 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:48:15.043 134145 DEBUG oslo_concurrency.lockutils [req-96586c70-a0ac-45be-ae9e-09c21c9c8a93 
9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:15.043 134145 DEBUG oslo_concurrency.lockutils [req-96586c70-a0ac-45be-ae9e-09c21c9c8a93 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:15.054 134145 DEBUG oslo_concurrency.lockutils [req-96586c70-a0ac-45be-ae9e-09c21c9c8a93 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:15.054 134145 DEBUG nova.scheduler.host_manager [req-96586c70-a0ac-45be-ae9e-09c21c9c8a93 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["fpu", "apic", "pku", "pge", "pse", "fsgsbase", "rtm", "mmx", "avx512bw", "cx16", "avx2", "cmov", "xsavec", "clwb", "smap", "clflushopt", "avx512-vpopcntdq", "pat", "lm", "pclmuldq", "msr", "sse4.2", "ssse3", 
"arat", "adx", "sse2", "fma", "aes", "rdtscp", "avx512vl", "xsaveopt", "3dnowprefetch", "vme", "hle", "sse4.1", "syscall", "vpclmulqdq", "cx8", "lahf_lm", "clflush", "avx512cd", "tsc", "avx512dq", "gfni", "pse36", "wbnoinvd", "sep", "nx", "avx512vbmi", "popcnt", "spec-ctrl", "vaes", "pae", "fxsr", "ssbd", "la57", "avx512f", "hypervisor", "mca", "abm", "avx512vbmi2", "mtrr", "movbe", "pni", "rdseed", "xgetbv1", "pcid", "invpcid", "erms", "ht", "avx512bitalg", "mce", "umip", "de", "rdrand", "pdpe1gb", "smep", "xsave", "bmi1", "bmi2", "avx", "avx512vnni", "x2apic", "tsc-deadline", "f16c", "sse"]}',created_at=2026-04-23T00:42:02Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=24,free_disk_gb=77,free_ram_mb=31387,host='cn-jenkins-deploy-platform-juju-os-697-1',host_ip=10.0.0.129,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-697-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=0,mapped=1,memory_mb=31899,memory_mb_used=512,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035207, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", 
"total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["id", "cpu_usage", "memory_usage", "pinned_cpus", "network_metadata", "siblings", "memory", "socket", "pcpuset", "mempages", "cpuset"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=0,service_id=None,stats={failed_builds='0',io_workload='0',num_instances='0',num_os_type_None='0',num_proj_73d51924cf2a478b945fb4df11ac484e='0',num_task_None='0',num_vm_active='0'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-23T01:47:37Z,uuid=aa32474e-00f6-4c5b-a2ec-d12a3f31dd05,vcpus=8,vcpus_used=0) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-23 01:48:15.055 134145 DEBUG nova.scheduler.host_manager [req-96586c70-a0ac-45be-ae9e-09c21c9c8a93 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-23 01:48:15.056 134145 DEBUG nova.scheduler.host_manager [req-96586c70-a0ac-45be-ae9e-09c21c9c8a93 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with service dict: {'id': 9, 'uuid': 
'979fb67d-7796-4f44-89f1-b2234e3e1381', 'host': 'cn-jenkins-deploy-platform-juju-os-697-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 395, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 23, 1, 48, 12, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 23, 0, 42, 2, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 23, 1, 48, 12, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-23 01:48:15.056 134145 DEBUG nova.scheduler.host_manager [req-96586c70-a0ac-45be-ae9e-09c21c9c8a93 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-23 01:48:15.056 134145 DEBUG oslo_concurrency.lockutils [req-96586c70-a0ac-45be-ae9e-09c21c9c8a93 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:15.056 134145 INFO nova.scheduler.host_manager [req-96586c70-a0ac-45be-ae9e-09c21c9c8a93 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-697-1 2026-04-23 01:48:15.058 134145 DEBUG nova.scheduler.manager [req-96586c70-a0ac-45be-ae9e-09c21c9c8a93 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 
463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 24576MB io_ops: 0 instances: 0]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-23 01:48:15.058 134145 DEBUG nova.scheduler.manager [req-96586c70-a0ac-45be-ae9e-09c21c9c8a93 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 24576MB io_ops: 0 instances: 0, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-23 01:48:15.058 134145 DEBUG nova.scheduler.utils [req-96586c70-a0ac-45be-ae9e-09c21c9c8a93 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Attempting to claim resources in the placement API for instance 8c6385ee-86bc-4746-95ac-920d5d4a19b2 claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-23 01:48:15.148 134145 DEBUG nova.scheduler.manager [req-96586c70-a0ac-45be-ae9e-09c21c9c8a93 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] [instance: 8c6385ee-86bc-4746-95ac-920d5d4a19b2] Selected host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 24576MB io_ops: 0 instances: 0 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-23 01:48:15.148 134145 DEBUG oslo_concurrency.lockutils [req-96586c70-a0ac-45be-ae9e-09c21c9c8a93 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock 
"('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:15.150 134145 DEBUG oslo_concurrency.lockutils [req-96586c70-a0ac-45be-ae9e-09c21c9c8a93 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:15.152 134145 DEBUG oslo_messaging._drivers.amqpdriver [req-96586c70-a0ac-45be-ae9e-09c21c9c8a93 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: d57685de76784b17aedbb581d8b72b0c NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:48:15.153 134145 DEBUG oslo_messaging._drivers.amqpdriver [req-96586c70-a0ac-45be-ae9e-09c21c9c8a93 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] sending reply msg_id: 21ea9c5ac36e47668ce5834e4d1221d6 reply queue: reply_b20bc86560af44159724b4fea607f8c2 time elapsed: 0.17977827800041268s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-23 01:48:15.980 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:15.980 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:15.980 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:16.776 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: c0cb8aa7739547a5a6653e432e101809 reply to reply_979ed65011a945248f9b38f6dd19b658 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-23 01:48:16.777 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:16.777 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 94e00ba27f95495ab65df03346c07d79 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:48:16.778 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:16.778 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:16.780 134140 DEBUG nova.scheduler.manager [req-44ae708c-a072-4fe9-bb2e-a67de6ee7426 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Starting to schedule for instances: ['b79fd545-97e8-47ad-adb4-77ccfd0da77d'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-23 01:48:16.785 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:16.785 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:16.785 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:16.786 134140 DEBUG nova.scheduler.request_filter [req-44ae708c-a072-4fe9-bb2e-a67de6ee7426 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:48:16.786 134140 DEBUG nova.scheduler.request_filter [req-44ae708c-a072-4fe9-bb2e-a67de6ee7426 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-23 01:48:16.786 134140 DEBUG nova.scheduler.request_filter [req-44ae708c-a072-4fe9-bb2e-a67de6ee7426 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:48:16.787 134140 DEBUG nova.scheduler.request_filter [req-44ae708c-a072-4fe9-bb2e-a67de6ee7426 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:48:16.787 134140 DEBUG nova.scheduler.request_filter [req-44ae708c-a072-4fe9-bb2e-a67de6ee7426 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 
463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:48:16.792 134140 DEBUG oslo_concurrency.lockutils [req-44ae708c-a072-4fe9-bb2e-a67de6ee7426 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:16.792 134140 DEBUG oslo_concurrency.lockutils [req-44ae708c-a072-4fe9-bb2e-a67de6ee7426 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:16.862 134140 DEBUG oslo_messaging._drivers.amqpdriver [req-44ae708c-a072-4fe9-bb2e-a67de6ee7426 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: 0e0ddc8db14d4283838b4c801b24c958 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:48:16.864 134140 DEBUG oslo_concurrency.lockutils [req-44ae708c-a072-4fe9-bb2e-a67de6ee7426 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:16.865 134140 DEBUG oslo_concurrency.lockutils [req-44ae708c-a072-4fe9-bb2e-a67de6ee7426 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:16.883 134140 DEBUG oslo_concurrency.lockutils [req-44ae708c-a072-4fe9-bb2e-a67de6ee7426 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:16.884 134140 DEBUG nova.scheduler.host_manager [req-44ae708c-a072-4fe9-bb2e-a67de6ee7426 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["fpu", "apic", "pku", "pge", "pse", "fsgsbase", "rtm", "mmx", "avx512bw", "cx16", "avx2", "cmov", "xsavec", "clwb", "smap", "clflushopt", "avx512-vpopcntdq", "pat", "lm", "pclmuldq", "msr", "sse4.2", "ssse3", "arat", "adx", "sse2", "fma", "aes", "rdtscp", "avx512vl", "xsaveopt", "3dnowprefetch", "vme", "hle", "sse4.1", "syscall", "vpclmulqdq", "cx8", "lahf_lm", "clflush", "avx512cd", "tsc", "avx512dq", "gfni", "pse36", "wbnoinvd", "sep", "nx", "avx512vbmi", "popcnt", "spec-ctrl", "vaes", "pae", 
"fxsr", "ssbd", "la57", "avx512f", "hypervisor", "mca", "abm", "avx512vbmi2", "mtrr", "movbe", "pni", "rdseed", "xgetbv1", "pcid", "invpcid", "erms", "ht", "avx512bitalg", "mce", "umip", "de", "rdrand", "pdpe1gb", "smep", "xsave", "bmi1", "bmi2", "avx", "avx512vnni", "x2apic", "tsc-deadline", "f16c", "sse"]}',created_at=2026-04-23T00:42:02Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=24,free_disk_gb=76,free_ram_mb=30363,host='cn-jenkins-deploy-platform-juju-os-697-1',host_ip=10.0.0.129,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-697-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=1,mapped=1,memory_mb=31899,memory_mb_used=1536,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035207, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}], 
"network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["id", "cpu_usage", "memory_usage", "pinned_cpus", "network_metadata", "siblings", "memory", "socket", "pcpuset", "mempages", "cpuset"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=1,service_id=None,stats={failed_builds='0',io_workload='1',num_instances='1',num_os_type_None='1',num_proj_73d51924cf2a478b945fb4df11ac484e='0',num_proj_bbe543f6c8ad405a838c992bdd870466='1',num_task_None='1',num_vm_active='0',num_vm_building='1'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-23T01:48:16Z,uuid=aa32474e-00f6-4c5b-a2ec-d12a3f31dd05,vcpus=8,vcpus_used=1) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-23 01:48:16.885 134140 DEBUG nova.scheduler.host_manager [req-44ae708c-a072-4fe9-bb2e-a67de6ee7426 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-23 01:48:16.885 134140 DEBUG nova.scheduler.host_manager [req-44ae708c-a072-4fe9-bb2e-a67de6ee7426 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with service dict: {'id': 9, 'uuid': '979fb67d-7796-4f44-89f1-b2234e3e1381', 'host': 'cn-jenkins-deploy-platform-juju-os-697-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 395, 'disabled': False, 'disabled_reason': None, 'last_seen_up': 
datetime.datetime(2026, 4, 23, 1, 48, 12, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 23, 0, 42, 2, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 23, 1, 48, 12, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-23 01:48:16.885 134140 DEBUG nova.scheduler.host_manager [req-44ae708c-a072-4fe9-bb2e-a67de6ee7426 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-23 01:48:16.886 134140 DEBUG oslo_concurrency.lockutils [req-44ae708c-a072-4fe9-bb2e-a67de6ee7426 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.update.<locals>._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:16.886 134140 INFO nova.scheduler.host_manager [req-44ae708c-a072-4fe9-bb2e-a67de6ee7426 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-697-1 2026-04-23 01:48:16.886 134140 DEBUG nova.scheduler.manager [req-44ae708c-a072-4fe9-bb2e-a67de6ee7426 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 30363MB disk: 24576MB io_ops: 1 instances: 1]) _get_sorted_hosts 
/usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-23 01:48:16.886 134140 DEBUG nova.scheduler.manager [req-44ae708c-a072-4fe9-bb2e-a67de6ee7426 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 30363MB disk: 24576MB io_ops: 1 instances: 1, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-23 01:48:16.886 134140 DEBUG nova.scheduler.utils [req-44ae708c-a072-4fe9-bb2e-a67de6ee7426 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Attempting to claim resources in the placement API for instance b79fd545-97e8-47ad-adb4-77ccfd0da77d claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-23 01:48:16.963 134140 DEBUG nova.scheduler.manager [req-44ae708c-a072-4fe9-bb2e-a67de6ee7426 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] [instance: b79fd545-97e8-47ad-adb4-77ccfd0da77d] Selected host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 30363MB disk: 24576MB io_ops: 1 instances: 1 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-23 01:48:16.963 134140 DEBUG oslo_concurrency.lockutils [req-44ae708c-a072-4fe9-bb2e-a67de6ee7426 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.<locals>._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 
2026-04-23 01:48:16.964 134140 DEBUG oslo_concurrency.lockutils [req-44ae708c-a072-4fe9-bb2e-a67de6ee7426 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.<locals>._locked" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:16.966 134140 DEBUG oslo_messaging._drivers.amqpdriver [req-44ae708c-a072-4fe9-bb2e-a67de6ee7426 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: 5ae8bbf2ae5e4d68a18c58b3a278b81f NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:48:16.968 134140 DEBUG oslo_messaging._drivers.amqpdriver [req-44ae708c-a072-4fe9-bb2e-a67de6ee7426 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] sending reply msg_id: c0cb8aa7739547a5a6653e432e101809 reply queue: reply_979ed65011a945248f9b38f6dd19b658 time elapsed: 0.19141144700006407s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-23 01:48:17.070 134146 DEBUG oslo_service.periodic_task [req-49657a0a-89a9-497a-b5c0-4d30e4d5c58d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:48:17.074 134146 DEBUG oslo_concurrency.lockutils [req-d13b055b-47c7-4175-84b1-5f50f8b8019a - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.001s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:17.074 134146 DEBUG oslo_concurrency.lockutils [req-d13b055b-47c7-4175-84b1-5f50f8b8019a - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:17.786 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:17.787 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:17.787 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:17.982 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:17.982 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:17.982 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:18.633 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 80719040ba0848d398348b911bfc3a72 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:48:18.633 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 80719040ba0848d398348b911bfc3a72 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:48:18.633 134138 DEBUG 
oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 80719040ba0848d398348b911bfc3a72 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:48:18.634 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:18.634 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:18.634 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 80719040ba0848d398348b911bfc3a72 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:48:18.634 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 80719040ba0848d398348b911bfc3a72 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:48:18.634 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:18.634 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:18.634 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:18.634 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:18.634 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:18.634 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with 
unique_id: 80719040ba0848d398348b911bfc3a72 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:48:18.634 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:18.635 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:18.636 134140 DEBUG oslo_concurrency.lockutils [req-96586c70-a0ac-45be-ae9e-09c21c9c8a93 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:18.636 134145 DEBUG oslo_concurrency.lockutils [req-96586c70-a0ac-45be-ae9e-09c21c9c8a93 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:18.636 134140 DEBUG oslo_concurrency.lockutils [req-96586c70-a0ac-45be-ae9e-09c21c9c8a93 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:18.636 134145 DEBUG oslo_concurrency.lockutils [req-96586c70-a0ac-45be-ae9e-09c21c9c8a93 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] 
Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:18.637 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:18.636 134138 DEBUG oslo_concurrency.lockutils [req-96586c70-a0ac-45be-ae9e-09c21c9c8a93 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:18.637 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:18.637 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:18.637 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:18.637 134138 DEBUG oslo_concurrency.lockutils [req-96586c70-a0ac-45be-ae9e-09c21c9c8a93 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:18.637 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:18.637 134145 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:18.637 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:18.638 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:18.638 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:18.639 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 80719040ba0848d398348b911bfc3a72 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:48:18.639 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:18.639 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 80719040ba0848d398348b911bfc3a72 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:48:18.640 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:18.640 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:18.642 134146 DEBUG oslo_concurrency.lockutils [req-96586c70-a0ac-45be-ae9e-09c21c9c8a93 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by 
"nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:18.642 134146 DEBUG oslo_concurrency.lockutils [req-96586c70-a0ac-45be-ae9e-09c21c9c8a93 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:18.642 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:18.643 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:18.643 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:19.638 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:19.638 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:19.639 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:19.639 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:19.639 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:19.639 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:19.639 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:19.639 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:19.640 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:19.644 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:19.645 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:19.645 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:20.055 134138 DEBUG oslo_service.periodic_task [req-bebd4749-82f5-4615-9afe-19390738057d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:48:20.059 134138 DEBUG oslo_concurrency.lockutils [req-d40a0a22-4aa6-4b9d-8957-5620a7c8f7dc - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:20.059 134138 DEBUG 
oslo_concurrency.lockutils [req-d40a0a22-4aa6-4b9d-8957-5620a7c8f7dc - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:20.220 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c04c8bce728b437493c50c86a0dba2da __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:48:20.220 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c04c8bce728b437493c50c86a0dba2da __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:48:20.220 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c04c8bce728b437493c50c86a0dba2da __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:48:20.221 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:20.221 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c04c8bce728b437493c50c86a0dba2da poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:48:20.221 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:20.221 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:20.221 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:20.221 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is 
running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:20.221 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c04c8bce728b437493c50c86a0dba2da poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:48:20.221 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c04c8bce728b437493c50c86a0dba2da poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:48:20.221 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:20.221 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:20.221 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:20.222 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:20.224 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c04c8bce728b437493c50c86a0dba2da __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:48:20.224 134145 DEBUG oslo_concurrency.lockutils [req-44ae708c-a072-4fe9-bb2e-a67de6ee7426 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:20.224 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener 
is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:20.224 134145 DEBUG oslo_concurrency.lockutils [req-44ae708c-a072-4fe9-bb2e-a67de6ee7426 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:20.224 134146 DEBUG oslo_concurrency.lockutils [req-44ae708c-a072-4fe9-bb2e-a67de6ee7426 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:20.224 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c04c8bce728b437493c50c86a0dba2da poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:48:20.224 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:20.224 134146 DEBUG oslo_concurrency.lockutils [req-44ae708c-a072-4fe9-bb2e-a67de6ee7426 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:20.224 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:20.225 134138 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:20.225 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:20.225 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:20.225 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:20.225 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:20.225 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:20.226 134138 DEBUG oslo_concurrency.lockutils [req-44ae708c-a072-4fe9-bb2e-a67de6ee7426 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:20.227 134138 DEBUG oslo_concurrency.lockutils [req-44ae708c-a072-4fe9-bb2e-a67de6ee7426 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:20.228 134140 DEBUG oslo_concurrency.lockutils 
[req-44ae708c-a072-4fe9-bb2e-a67de6ee7426 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:20.228 134140 DEBUG oslo_concurrency.lockutils [req-44ae708c-a072-4fe9-bb2e-a67de6ee7426 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:20.228 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:20.228 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:20.229 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:20.231 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:20.231 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:20.231 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:21.226 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:21.226 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:21.226 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:21.226 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:21.227 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:21.227 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:21.230 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:21.230 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:21.230 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:21.232 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:21.232 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:21.233 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:23.228 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:23.228 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:23.228 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:23.229 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:23.229 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:23.229 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:23.232 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:23.232 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:23.232 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:23.234 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:23.234 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:23.234 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:24.062 134140 DEBUG oslo_service.periodic_task [req-ac64b215-7d74-4f3a-b5a6-b944dd0f4d86 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:48:24.067 134140 DEBUG oslo_concurrency.lockutils [req-6f10c606-7821-46b1-9658-6ad1b256ac36 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:24.067 134140 DEBUG oslo_concurrency.lockutils [req-6f10c606-7821-46b1-9658-6ad1b256ac36 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:27.230 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:27.230 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:27.231 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:27.231 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:27.232 134146 DEBUG oslo_messaging._drivers.amqpdriver 
[-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:27.232 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:27.234 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:27.234 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:27.234 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:27.236 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:27.237 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:27.237 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:29.899 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c978ddeb05c64d7788f8ee5116343e25 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:48:29.899 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c978ddeb05c64d7788f8ee5116343e25 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:48:29.900 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c978ddeb05c64d7788f8ee5116343e25 __call__ 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:48:29.900 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:29.900 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:29.900 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c978ddeb05c64d7788f8ee5116343e25 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:48:29.900 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:29.900 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c978ddeb05c64d7788f8ee5116343e25 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:48:29.900 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c978ddeb05c64d7788f8ee5116343e25 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:48:29.900 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:29.900 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:29.900 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:29.900 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 
2026-04-23 01:48:29.900 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:29.900 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:29.901 134145 DEBUG oslo_concurrency.lockutils [req-a09741d6-e40a-4572-b007-db58767b378b - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:29.901 134146 DEBUG oslo_concurrency.lockutils [req-a09741d6-e40a-4572-b007-db58767b378b - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:29.901 134145 DEBUG nova.scheduler.host_manager [req-a09741d6-e40a-4572-b007-db58767b378b - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:48:29.901 134138 DEBUG oslo_concurrency.lockutils [req-a09741d6-e40a-4572-b007-db58767b378b - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:29.901 134146 DEBUG nova.scheduler.host_manager [req-a09741d6-e40a-4572-b007-db58767b378b - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:48:29.901 134145 DEBUG oslo_concurrency.lockutils [req-a09741d6-e40a-4572-b007-db58767b378b - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:29.901 134138 DEBUG nova.scheduler.host_manager [req-a09741d6-e40a-4572-b007-db58767b378b - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:48:29.901 134138 DEBUG oslo_concurrency.lockutils [req-a09741d6-e40a-4572-b007-db58767b378b - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:29.901 134146 DEBUG oslo_concurrency.lockutils [req-a09741d6-e40a-4572-b007-db58767b378b - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:29.902 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:29.902 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:29.902 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:29.902 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:29.902 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:29.902 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:29.903 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:29.903 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:29.903 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c978ddeb05c64d7788f8ee5116343e25 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:48:29.903 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:29.903 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:29.903 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c978ddeb05c64d7788f8ee5116343e25 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:48:29.903 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:29.903 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:29.904 134140 
DEBUG oslo_concurrency.lockutils [req-a09741d6-e40a-4572-b007-db58767b378b - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:29.904 134140 DEBUG nova.scheduler.host_manager [req-a09741d6-e40a-4572-b007-db58767b378b - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:48:29.904 134140 DEBUG oslo_concurrency.lockutils [req-a09741d6-e40a-4572-b007-db58767b378b - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:29.905 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:29.905 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:29.905 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:30.903 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:30.903 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:30.903 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:30.903 134146 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:30.903 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:30.904 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:30.904 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:30.905 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:30.905 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:30.906 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:30.906 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:30.906 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:32.906 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:32.906 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:32.906 134146 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:32.906 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:32.906 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:32.906 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:32.907 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:32.907 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:32.907 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:32.907 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:32.907 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:32.908 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:36.062 134145 DEBUG oslo_service.periodic_task [req-c1214e8f-a9fb-44bc-8be0-924be597284e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks 
/usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:48:36.065 134145 DEBUG oslo_concurrency.lockutils [req-ad4b37b4-8971-4848-8bb0-959d4a6636f0 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:36.066 134145 DEBUG oslo_concurrency.lockutils [req-ad4b37b4-8971-4848-8bb0-959d4a6636f0 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:36.908 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:36.908 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:36.908 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:36.908 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:36.909 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:36.909 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:36.909 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:36.909 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:36.909 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:36.909 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:36.909 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:36.910 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:40.059 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 096edf270eb84647bae97fec7d0f7df9 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:48:40.059 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 096edf270eb84647bae97fec7d0f7df9 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:48:40.059 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 096edf270eb84647bae97fec7d0f7df9 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:48:40.060 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:40.060 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 
2026-04-23 01:48:40.060 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:40.060 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 096edf270eb84647bae97fec7d0f7df9 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:48:40.060 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 096edf270eb84647bae97fec7d0f7df9 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:48:40.060 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 096edf270eb84647bae97fec7d0f7df9 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:48:40.060 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:40.060 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:40.060 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:40.060 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:40.060 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:40.060 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:40.061 134145 DEBUG 
oslo_concurrency.lockutils [req-f165e84f-dd95-414b-8454-5bf6ab106169 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:40.061 134146 DEBUG oslo_concurrency.lockutils [req-f165e84f-dd95-414b-8454-5bf6ab106169 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:40.061 134138 DEBUG oslo_concurrency.lockutils [req-f165e84f-dd95-414b-8454-5bf6ab106169 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:40.061 134145 DEBUG oslo_concurrency.lockutils [req-f165e84f-dd95-414b-8454-5bf6ab106169 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:40.061 134146 DEBUG oslo_concurrency.lockutils [req-f165e84f-dd95-414b-8454-5bf6ab106169 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by 
"nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:40.061 134138 DEBUG oslo_concurrency.lockutils [req-f165e84f-dd95-414b-8454-5bf6ab106169 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:40.062 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 096edf270eb84647bae97fec7d0f7df9 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:48:40.062 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:40.063 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:40.063 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:40.063 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:40.063 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:40.063 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:40.063 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:40.063 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 096edf270eb84647bae97fec7d0f7df9 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:48:40.063 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:40.063 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:40.062 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:40.064 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:40.064 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:40.064 134140 DEBUG oslo_concurrency.lockutils [req-f165e84f-dd95-414b-8454-5bf6ab106169 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:40.064 134140 DEBUG oslo_concurrency.lockutils [req-f165e84f-dd95-414b-8454-5bf6ab106169 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.001s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:40.065 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:40.066 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:40.067 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:40.143 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 8c003eaeb38e43299c8c9f0be6c17c49 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:48:40.143 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 8c003eaeb38e43299c8c9f0be6c17c49 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:48:40.143 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:40.144 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:40.144 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 8c003eaeb38e43299c8c9f0be6c17c49 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:48:40.144 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 8c003eaeb38e43299c8c9f0be6c17c49 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:48:40.144 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:40.144 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:40.144 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:40.144 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:40.145 134145 DEBUG oslo_concurrency.lockutils [req-62448b15-eb2a-4637-8b07-396674c3cc7a 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:40.145 134145 DEBUG oslo_concurrency.lockutils [req-62448b15-eb2a-4637-8b07-396674c3cc7a 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:40.145 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 8c003eaeb38e43299c8c9f0be6c17c49 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:48:40.145 134138 DEBUG oslo_concurrency.lockutils [req-62448b15-eb2a-4637-8b07-396674c3cc7a 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by 
"nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:40.145 134138 DEBUG oslo_concurrency.lockutils [req-62448b15-eb2a-4637-8b07-396674c3cc7a 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:40.145 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:40.146 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 8c003eaeb38e43299c8c9f0be6c17c49 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:48:40.146 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:40.146 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:40.146 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:40.147 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 8c003eaeb38e43299c8c9f0be6c17c49 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:48:40.146 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:40.147 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] 
Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:40.146 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:40.148 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 8c003eaeb38e43299c8c9f0be6c17c49 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:48:40.148 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:40.148 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:40.148 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:40.148 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:40.148 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:40.148 134146 DEBUG oslo_concurrency.lockutils [req-62448b15-eb2a-4637-8b07-396674c3cc7a 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:40.149 134146 DEBUG oslo_concurrency.lockutils [req-62448b15-eb2a-4637-8b07-396674c3cc7a 9f841ff7578c4b70aff28b139d7e9902 
bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:40.149 134140 DEBUG oslo_concurrency.lockutils [req-62448b15-eb2a-4637-8b07-396674c3cc7a 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:40.149 134140 DEBUG oslo_concurrency.lockutils [req-62448b15-eb2a-4637-8b07-396674c3cc7a 9f841ff7578c4b70aff28b139d7e9902 bbe543f6c8ad405a838c992bdd870466 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:40.150 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:40.151 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:40.151 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:40.150 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:40.151 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:40.152 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:41.149 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:41.149 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:41.149 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:41.150 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:41.150 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:41.150 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:41.152 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:41.152 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:41.152 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:41.153 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection 
timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:41.153 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:41.153 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:43.151 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:43.151 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:43.152 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:43.152 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:43.152 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:43.153 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:43.154 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:43.154 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:43.155 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection 
consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:43.155 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:43.155 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:43.155 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:47.154 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:47.154 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:47.154 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:47.155 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:47.155 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:47.155 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:47.155 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:47.155 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is 
running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:47.155 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:47.157 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:47.157 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:47.157 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:48.069 134146 DEBUG oslo_service.periodic_task [req-d13b055b-47c7-4175-84b1-5f50f8b8019a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:48:48.073 134146 DEBUG oslo_concurrency.lockutils [req-f6512447-05bc-4c7b-8912-33725562e2c2 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:48.073 134146 DEBUG oslo_concurrency.lockutils [req-f6512447-05bc-4c7b-8912-33725562e2c2 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:50.064 134138 DEBUG oslo_service.periodic_task [req-d40a0a22-4aa6-4b9d-8957-5620a7c8f7dc - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks 
/usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:48:50.069 134138 DEBUG oslo_concurrency.lockutils [req-644cce0e-dadb-435c-b969-e25096dc502e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:50.070 134138 DEBUG oslo_concurrency.lockutils [req-644cce0e-dadb-435c-b969-e25096dc502e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:50.792 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: 4f4392fe3dbc447fb67a6d002f056b4d reply to reply_b20bc86560af44159724b4fea607f8c2 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-23 01:48:50.793 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:50.793 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 9c168f81b5724952a1cc24cac3558ce6 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:48:50.793 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:50.793 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:50.795 134146 DEBUG nova.scheduler.manager [req-360b56d7-7c75-4e39-bfab-11cc1af10cc4 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 
463c1a7f457542babd061a9f36cbf244] Starting to schedule for instances: ['5b722877-3fde-46e3-b26f-45746d2724c1'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-23 01:48:50.796 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:50.796 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:50.796 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:50.799 134146 DEBUG nova.scheduler.request_filter [req-360b56d7-7c75-4e39-bfab-11cc1af10cc4 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:48:50.800 134146 DEBUG nova.scheduler.request_filter [req-360b56d7-7c75-4e39-bfab-11cc1af10cc4 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-23 01:48:50.800 134146 DEBUG nova.scheduler.request_filter [req-360b56d7-7c75-4e39-bfab-11cc1af10cc4 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:48:50.800 134146 DEBUG nova.scheduler.request_filter 
[req-360b56d7-7c75-4e39-bfab-11cc1af10cc4 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:48:50.800 134146 DEBUG nova.scheduler.request_filter [req-360b56d7-7c75-4e39-bfab-11cc1af10cc4 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:48:50.805 134146 DEBUG oslo_concurrency.lockutils [req-360b56d7-7c75-4e39-bfab-11cc1af10cc4 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:50.805 134146 DEBUG oslo_concurrency.lockutils [req-360b56d7-7c75-4e39-bfab-11cc1af10cc4 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:50.899 134146 DEBUG oslo_messaging._drivers.amqpdriver [req-360b56d7-7c75-4e39-bfab-11cc1af10cc4 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: 26ac338765f442c68a8e81bf9d4dd96a NOTIFY exchange 'nova' topic 'notifications.info' _send 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:48:50.901 134146 DEBUG oslo_concurrency.lockutils [req-360b56d7-7c75-4e39-bfab-11cc1af10cc4 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:50.901 134146 DEBUG oslo_concurrency.lockutils [req-360b56d7-7c75-4e39-bfab-11cc1af10cc4 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:50.910 134146 DEBUG oslo_concurrency.lockutils [req-360b56d7-7c75-4e39-bfab-11cc1af10cc4 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:50.910 134146 DEBUG nova.scheduler.host_manager [req-360b56d7-7c75-4e39-bfab-11cc1af10cc4 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["fpu", "apic", "pku", "pge", "pse", 
"fsgsbase", "rtm", "mmx", "avx512bw", "cx16", "avx2", "cmov", "xsavec", "clwb", "smap", "clflushopt", "avx512-vpopcntdq", "pat", "lm", "pclmuldq", "msr", "sse4.2", "ssse3", "arat", "adx", "sse2", "fma", "aes", "rdtscp", "avx512vl", "xsaveopt", "3dnowprefetch", "vme", "hle", "sse4.1", "syscall", "vpclmulqdq", "cx8", "lahf_lm", "clflush", "avx512cd", "tsc", "avx512dq", "gfni", "pse36", "wbnoinvd", "sep", "nx", "avx512vbmi", "popcnt", "spec-ctrl", "vaes", "pae", "fxsr", "ssbd", "la57", "avx512f", "hypervisor", "mca", "abm", "avx512vbmi2", "mtrr", "movbe", "pni", "rdseed", "xgetbv1", "pcid", "invpcid", "erms", "ht", "avx512bitalg", "mce", "umip", "de", "rdrand", "pdpe1gb", "smep", "xsave", "bmi1", "bmi2", "avx", "avx512vnni", "x2apic", "tsc-deadline", "f16c", "sse"]}',created_at=2026-04-23T00:42:02Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=24,free_disk_gb=77,free_ram_mb=31387,host='cn-jenkins-deploy-platform-juju-os-697-1',host_ip=10.0.0.129,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-697-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=0,mapped=1,memory_mb=31899,memory_mb_used=512,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035207, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", 
"nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["id", "cpu_usage", "memory_usage", "pinned_cpus", "network_metadata", "siblings", "memory", "socket", "pcpuset", "mempages", "cpuset"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=0,service_id=None,stats={failed_builds='0',io_workload='0',num_instances='0',num_os_type_None='0',num_proj_bbe543f6c8ad405a838c992bdd870466='0',num_task_None='0',num_vm_active='0'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-23T01:48:40Z,uuid=aa32474e-00f6-4c5b-a2ec-d12a3f31dd05,vcpus=8,vcpus_used=0) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-23 01:48:50.912 134146 DEBUG nova.scheduler.host_manager [req-360b56d7-7c75-4e39-bfab-11cc1af10cc4 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-23 01:48:50.912 134146 DEBUG nova.scheduler.host_manager [req-360b56d7-7c75-4e39-bfab-11cc1af10cc4 
a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with service dict: {'id': 9, 'uuid': '979fb67d-7796-4f44-89f1-b2234e3e1381', 'host': 'cn-jenkins-deploy-platform-juju-os-697-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 398, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 23, 1, 48, 42, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 23, 0, 42, 2, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 23, 1, 48, 42, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-23 01:48:50.912 134146 DEBUG nova.scheduler.host_manager [req-360b56d7-7c75-4e39-bfab-11cc1af10cc4 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-23 01:48:50.912 134146 DEBUG oslo_concurrency.lockutils [req-360b56d7-7c75-4e39-bfab-11cc1af10cc4 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:50.912 134146 INFO nova.scheduler.host_manager [req-360b56d7-7c75-4e39-bfab-11cc1af10cc4 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-697-1 
2026-04-23 01:48:50.913 134146 DEBUG nova.scheduler.manager [req-360b56d7-7c75-4e39-bfab-11cc1af10cc4 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 24576MB io_ops: 0 instances: 0]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-23 01:48:50.913 134146 DEBUG nova.scheduler.manager [req-360b56d7-7c75-4e39-bfab-11cc1af10cc4 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 24576MB io_ops: 0 instances: 0, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-23 01:48:50.913 134146 DEBUG nova.scheduler.utils [req-360b56d7-7c75-4e39-bfab-11cc1af10cc4 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Attempting to claim resources in the placement API for instance 5b722877-3fde-46e3-b26f-45746d2724c1 claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-23 01:48:51.009 134146 DEBUG nova.scheduler.manager [req-360b56d7-7c75-4e39-bfab-11cc1af10cc4 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] [instance: 5b722877-3fde-46e3-b26f-45746d2724c1] Selected host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 24576MB io_ops: 0 instances: 0 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-23 01:48:51.009 134146 DEBUG oslo_concurrency.lockutils 
[req-360b56d7-7c75-4e39-bfab-11cc1af10cc4 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:51.010 134146 DEBUG oslo_concurrency.lockutils [req-360b56d7-7c75-4e39-bfab-11cc1af10cc4 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:51.011 134146 DEBUG oslo_messaging._drivers.amqpdriver [req-360b56d7-7c75-4e39-bfab-11cc1af10cc4 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: dd98feda638d4a9d9311a20c6a044937 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:48:51.012 134146 DEBUG oslo_messaging._drivers.amqpdriver [req-360b56d7-7c75-4e39-bfab-11cc1af10cc4 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] sending reply msg_id: 4f4392fe3dbc447fb67a6d002f056b4d reply queue: reply_b20bc86560af44159724b4fea607f8c2 time elapsed: 0.2198127290002958s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-23 01:48:51.798 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:51.798 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:51.799 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:51.967 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: 54f4100242154568bd7a3549ea0c9747 reply to reply_b20bc86560af44159724b4fea607f8c2 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-23 01:48:51.967 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:51.967 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ef5e49446a164a0db15f7c2d8138c70a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:48:51.968 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:51.968 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:51.969 134138 DEBUG nova.scheduler.manager [req-2d503dd5-8289-45b7-abe0-6a6a653d001c a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Starting to schedule for instances: ['ed156984-2f97-4816-8ed7-76fde9ce90d6'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-23 01:48:51.971 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:51.971 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:51.971 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:51.974 134138 DEBUG nova.scheduler.request_filter [req-2d503dd5-8289-45b7-abe0-6a6a653d001c a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:48:51.975 134138 DEBUG nova.scheduler.request_filter [req-2d503dd5-8289-45b7-abe0-6a6a653d001c a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-23 01:48:51.975 134138 DEBUG nova.scheduler.request_filter [req-2d503dd5-8289-45b7-abe0-6a6a653d001c a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:48:51.975 134138 DEBUG nova.scheduler.request_filter [req-2d503dd5-8289-45b7-abe0-6a6a653d001c a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 
01:48:51.976 134138 DEBUG nova.scheduler.request_filter [req-2d503dd5-8289-45b7-abe0-6a6a653d001c a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:48:51.980 134138 DEBUG oslo_concurrency.lockutils [req-2d503dd5-8289-45b7-abe0-6a6a653d001c a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:51.981 134138 DEBUG oslo_concurrency.lockutils [req-2d503dd5-8289-45b7-abe0-6a6a653d001c a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:52.044 134138 DEBUG oslo_messaging._drivers.amqpdriver [req-2d503dd5-8289-45b7-abe0-6a6a653d001c a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: 719c207668124028bd8af88c32e1a6ce NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:48:52.046 134138 DEBUG oslo_concurrency.lockutils [req-2d503dd5-8289-45b7-abe0-6a6a653d001c a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock 
"de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:52.046 134138 DEBUG oslo_concurrency.lockutils [req-2d503dd5-8289-45b7-abe0-6a6a653d001c a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:52.055 134138 DEBUG oslo_concurrency.lockutils [req-2d503dd5-8289-45b7-abe0-6a6a653d001c a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:52.056 134138 DEBUG nova.scheduler.host_manager [req-2d503dd5-8289-45b7-abe0-6a6a653d001c a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["fpu", "apic", "pku", "pge", "pse", "fsgsbase", "rtm", "mmx", "avx512bw", "cx16", "avx2", "cmov", "xsavec", "clwb", "smap", "clflushopt", "avx512-vpopcntdq", "pat", "lm", "pclmuldq", "msr", "sse4.2", "ssse3", "arat", "adx", "sse2", "fma", "aes", "rdtscp", "avx512vl", "xsaveopt", "3dnowprefetch", "vme", "hle", "sse4.1", "syscall", "vpclmulqdq", "cx8", 
"lahf_lm", "clflush", "avx512cd", "tsc", "avx512dq", "gfni", "pse36", "wbnoinvd", "sep", "nx", "avx512vbmi", "popcnt", "spec-ctrl", "vaes", "pae", "fxsr", "ssbd", "la57", "avx512f", "hypervisor", "mca", "abm", "avx512vbmi2", "mtrr", "movbe", "pni", "rdseed", "xgetbv1", "pcid", "invpcid", "erms", "ht", "avx512bitalg", "mce", "umip", "de", "rdrand", "pdpe1gb", "smep", "xsave", "bmi1", "bmi2", "avx", "avx512vnni", "x2apic", "tsc-deadline", "f16c", "sse"]}',created_at=2026-04-23T00:42:02Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=24,free_disk_gb=76,free_ram_mb=30363,host='cn-jenkins-deploy-platform-juju-os-697-1',host_ip=10.0.0.129,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-697-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=1,mapped=1,memory_mb=31899,memory_mb_used=1536,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035207, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", 
"nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["id", "cpu_usage", "memory_usage", "pinned_cpus", "network_metadata", "siblings", "memory", "socket", "pcpuset", "mempages", "cpuset"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=1,service_id=None,stats={failed_builds='0',io_workload='1',num_instances='1',num_os_type_None='1',num_proj_50feff4feb574037aefd0bf043ed97fb='1',num_proj_bbe543f6c8ad405a838c992bdd870466='0',num_task_None='1',num_vm_active='0',num_vm_building='1'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-23T01:48:51Z,uuid=aa32474e-00f6-4c5b-a2ec-d12a3f31dd05,vcpus=8,vcpus_used=1) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-23 01:48:52.057 134138 DEBUG nova.scheduler.host_manager [req-2d503dd5-8289-45b7-abe0-6a6a653d001c a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-23 01:48:52.057 134138 DEBUG nova.scheduler.host_manager [req-2d503dd5-8289-45b7-abe0-6a6a653d001c a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with service dict: {'id': 9, 'uuid': '979fb67d-7796-4f44-89f1-b2234e3e1381', 'host': 
'cn-jenkins-deploy-platform-juju-os-697-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 398, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 23, 1, 48, 42, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 23, 0, 42, 2, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 23, 1, 48, 42, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-23 01:48:52.057 134138 DEBUG nova.scheduler.host_manager [req-2d503dd5-8289-45b7-abe0-6a6a653d001c a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-23 01:48:52.058 134138 DEBUG oslo_concurrency.lockutils [req-2d503dd5-8289-45b7-abe0-6a6a653d001c a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:52.058 134138 INFO nova.scheduler.host_manager [req-2d503dd5-8289-45b7-abe0-6a6a653d001c a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-697-1 2026-04-23 01:48:52.060 134138 DEBUG nova.scheduler.manager [req-2d503dd5-8289-45b7-abe0-6a6a653d001c a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Filtered 
dict_values([(cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 30363MB disk: 24576MB io_ops: 1 instances: 1]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-23 01:48:52.060 134138 DEBUG nova.scheduler.manager [req-2d503dd5-8289-45b7-abe0-6a6a653d001c a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 30363MB disk: 24576MB io_ops: 1 instances: 1, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-23 01:48:52.061 134138 DEBUG nova.scheduler.utils [req-2d503dd5-8289-45b7-abe0-6a6a653d001c a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Attempting to claim resources in the placement API for instance ed156984-2f97-4816-8ed7-76fde9ce90d6 claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-23 01:48:52.124 134138 DEBUG nova.scheduler.manager [req-2d503dd5-8289-45b7-abe0-6a6a653d001c a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] [instance: ed156984-2f97-4816-8ed7-76fde9ce90d6] Selected host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 30363MB disk: 24576MB io_ops: 1 instances: 1 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-23 01:48:52.125 134138 DEBUG oslo_concurrency.lockutils [req-2d503dd5-8289-45b7-abe0-6a6a653d001c a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" 
acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:52.125 134138 DEBUG oslo_concurrency.lockutils [req-2d503dd5-8289-45b7-abe0-6a6a653d001c a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:52.127 134138 DEBUG oslo_messaging._drivers.amqpdriver [req-2d503dd5-8289-45b7-abe0-6a6a653d001c a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: f2f86ba95a6d4e5989f6ffdc2dea804d NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:48:52.128 134138 DEBUG oslo_messaging._drivers.amqpdriver [req-2d503dd5-8289-45b7-abe0-6a6a653d001c a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] sending reply msg_id: 54f4100242154568bd7a3549ea0c9747 reply queue: reply_b20bc86560af44159724b4fea607f8c2 time elapsed: 0.16093724599977577s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-23 01:48:52.973 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:52.974 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:52.974 134138 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:53.801 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:53.801 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:53.801 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:54.078 134140 DEBUG oslo_service.periodic_task [req-6f10c606-7821-46b1-9658-6ad1b256ac36 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:48:54.082 134140 DEBUG oslo_concurrency.lockutils [req-84fd97cf-b1d7-4f0b-8e2f-b129ed946e66 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:54.083 134140 DEBUG oslo_concurrency.lockutils [req-84fd97cf-b1d7-4f0b-8e2f-b129ed946e66 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:54.303 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 314d8ab1254742d681b7a45d59108ffc __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:48:54.303 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 314d8ab1254742d681b7a45d59108ffc 
__call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:48:54.303 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 314d8ab1254742d681b7a45d59108ffc __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:48:54.304 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 314d8ab1254742d681b7a45d59108ffc __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:48:54.304 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:54.304 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:54.304 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 314d8ab1254742d681b7a45d59108ffc poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:48:54.304 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:54.304 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 314d8ab1254742d681b7a45d59108ffc poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:48:54.304 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:54.304 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:54.304 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 
314d8ab1254742d681b7a45d59108ffc poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:48:54.304 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 314d8ab1254742d681b7a45d59108ffc poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:48:54.304 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:54.304 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:54.304 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:54.304 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:54.304 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:54.305 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:54.305 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:54.307 134146 DEBUG oslo_concurrency.lockutils [req-360b56d7-7c75-4e39-bfab-11cc1af10cc4 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:54.307 134145 DEBUG oslo_concurrency.lockutils [req-360b56d7-7c75-4e39-bfab-11cc1af10cc4 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:54.307 134140 DEBUG oslo_concurrency.lockutils [req-360b56d7-7c75-4e39-bfab-11cc1af10cc4 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:54.307 134146 DEBUG oslo_concurrency.lockutils [req-360b56d7-7c75-4e39-bfab-11cc1af10cc4 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:54.307 134145 DEBUG oslo_concurrency.lockutils [req-360b56d7-7c75-4e39-bfab-11cc1af10cc4 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:54.308 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:54.308 134140 DEBUG 
oslo_concurrency.lockutils [req-360b56d7-7c75-4e39-bfab-11cc1af10cc4 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:54.308 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:54.308 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:54.308 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:54.308 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:54.308 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:54.309 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:54.309 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:54.309 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:54.310 134138 DEBUG oslo_concurrency.lockutils [req-360b56d7-7c75-4e39-bfab-11cc1af10cc4 a09f16937f634b5a85c8b00eb830d566 
50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:54.310 134138 DEBUG oslo_concurrency.lockutils [req-360b56d7-7c75-4e39-bfab-11cc1af10cc4 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:54.311 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:54.311 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:54.311 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:55.222 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ecae7e05a0fa4aae94441be9acceb55a __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:48:55.222 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ecae7e05a0fa4aae94441be9acceb55a __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:48:55.222 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ecae7e05a0fa4aae94441be9acceb55a __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:48:55.223 134145 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:55.223 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ecae7e05a0fa4aae94441be9acceb55a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:48:55.223 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:55.223 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:55.223 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:55.223 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:55.223 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ecae7e05a0fa4aae94441be9acceb55a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:48:55.223 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ecae7e05a0fa4aae94441be9acceb55a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:48:55.224 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:55.224 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:55.224 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:55.224 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:55.226 134145 DEBUG oslo_concurrency.lockutils [req-2d503dd5-8289-45b7-abe0-6a6a653d001c a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:55.226 134145 DEBUG oslo_concurrency.lockutils [req-2d503dd5-8289-45b7-abe0-6a6a653d001c a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:55.227 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:55.227 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:55.227 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ecae7e05a0fa4aae94441be9acceb55a __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:48:55.227 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:55.227 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:55.227 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ecae7e05a0fa4aae94441be9acceb55a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:48:55.228 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:55.228 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:55.228 134146 DEBUG oslo_concurrency.lockutils [req-2d503dd5-8289-45b7-abe0-6a6a653d001c a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:55.228 134146 DEBUG oslo_concurrency.lockutils [req-2d503dd5-8289-45b7-abe0-6a6a653d001c a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:55.228 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:55.229 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:55.229 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:55.230 134138 DEBUG oslo_concurrency.lockutils [req-2d503dd5-8289-45b7-abe0-6a6a653d001c a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:55.231 134138 DEBUG oslo_concurrency.lockutils [req-2d503dd5-8289-45b7-abe0-6a6a653d001c a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:55.231 134140 DEBUG oslo_concurrency.lockutils [req-2d503dd5-8289-45b7-abe0-6a6a653d001c a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:48:55.232 134140 DEBUG oslo_concurrency.lockutils [req-2d503dd5-8289-45b7-abe0-6a6a653d001c a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:48:55.233 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:55.233 134140 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:55.234 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:55.234 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:55.234 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:55.234 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:56.229 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:56.229 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:56.229 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:56.231 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:56.231 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:56.231 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:56.235 134140 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:56.235 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:56.235 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:56.235 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:56.236 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:56.236 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:58.230 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:58.230 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:58.230 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:58.233 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:58.233 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:58.233 134146 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:58.238 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:58.238 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:58.238 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:48:58.239 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:48:58.239 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:48:58.239 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:02.232 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:02.233 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:02.233 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:02.237 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:02.238 
134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:02.238 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:02.243 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:02.243 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:02.243 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:02.244 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:02.244 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:02.244 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:07.053 134145 DEBUG oslo_service.periodic_task [req-ad4b37b4-8971-4848-8bb0-959d4a6636f0 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:49:07.057 134145 DEBUG oslo_concurrency.lockutils [req-ce642657-7c65-48c2-ae26-c9ac1989f2a4 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:49:07.057 134145 DEBUG oslo_concurrency.lockutils [req-ce642657-7c65-48c2-ae26-c9ac1989f2a4 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:49:10.236 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:10.237 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:10.237 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:10.243 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:10.243 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:10.243 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:10.248 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:10.248 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:10.249 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:10.249 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:10.250 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:10.250 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:12.074 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: b6609a44e32f4fe5a3a3fd0bb0b155d6 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:49:12.074 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: b6609a44e32f4fe5a3a3fd0bb0b155d6 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:49:12.075 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: b6609a44e32f4fe5a3a3fd0bb0b155d6 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:49:12.075 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:12.075 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:12.075 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: b6609a44e32f4fe5a3a3fd0bb0b155d6 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:49:12.075 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 
b6609a44e32f4fe5a3a3fd0bb0b155d6 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:49:12.075 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:12.075 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:12.075 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: b6609a44e32f4fe5a3a3fd0bb0b155d6 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:49:12.075 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:12.075 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:12.075 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:12.075 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:12.076 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:12.076 134146 DEBUG oslo_concurrency.lockutils [req-47712ab7-67ad-476f-aafc-a34cf33b84d6 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:49:12.076 134145 DEBUG oslo_concurrency.lockutils [req-47712ab7-67ad-476f-aafc-a34cf33b84d6 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:49:12.076 134146 DEBUG oslo_concurrency.lockutils [req-47712ab7-67ad-476f-aafc-a34cf33b84d6 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:49:12.076 134145 DEBUG oslo_concurrency.lockutils [req-47712ab7-67ad-476f-aafc-a34cf33b84d6 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:49:12.076 134138 DEBUG oslo_concurrency.lockutils [req-47712ab7-67ad-476f-aafc-a34cf33b84d6 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:49:12.076 134138 DEBUG oslo_concurrency.lockutils [req-47712ab7-67ad-476f-aafc-a34cf33b84d6 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 
463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:49:12.077 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: b6609a44e32f4fe5a3a3fd0bb0b155d6 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:49:12.077 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:12.077 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: b6609a44e32f4fe5a3a3fd0bb0b155d6 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:49:12.078 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:12.078 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:12.078 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:12.078 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:12.078 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:12.078 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:12.078 134138 DEBUG oslo_messaging._drivers.amqpdriver 
[-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:12.078 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:12.079 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:12.079 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:12.079 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:12.080 134140 DEBUG oslo_concurrency.lockutils [req-47712ab7-67ad-476f-aafc-a34cf33b84d6 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:49:12.080 134140 DEBUG oslo_concurrency.lockutils [req-47712ab7-67ad-476f-aafc-a34cf33b84d6 a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:49:12.081 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:12.081 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:12.081 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:12.135 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 9756798ab601488097186235d9821b99 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:49:12.135 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 9756798ab601488097186235d9821b99 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:49:12.136 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:12.136 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 9756798ab601488097186235d9821b99 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:49:12.136 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:12.136 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 9756798ab601488097186235d9821b99 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:49:12.136 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:12.136 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:12.136 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:12.136 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:12.137 134146 DEBUG oslo_concurrency.lockutils [req-e0203f22-5009-4153-8711-afb71d3738ab a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:49:12.137 134146 DEBUG oslo_concurrency.lockutils [req-e0203f22-5009-4153-8711-afb71d3738ab a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:49:12.139 134145 DEBUG oslo_concurrency.lockutils [req-e0203f22-5009-4153-8711-afb71d3738ab a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:49:12.139 134145 DEBUG oslo_concurrency.lockutils [req-e0203f22-5009-4153-8711-afb71d3738ab a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:49:12.140 134146 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:12.140 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:12.140 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:12.140 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:12.140 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:12.139 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 9756798ab601488097186235d9821b99 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:49:12.140 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:12.141 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:12.141 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 9756798ab601488097186235d9821b99 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:49:12.141 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:12.142 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:12.143 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 9756798ab601488097186235d9821b99 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:49:12.143 134140 DEBUG oslo_concurrency.lockutils [req-e0203f22-5009-4153-8711-afb71d3738ab a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:49:12.143 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:12.143 134140 DEBUG oslo_concurrency.lockutils [req-e0203f22-5009-4153-8711-afb71d3738ab a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:49:12.143 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 9756798ab601488097186235d9821b99 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:49:12.144 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:12.144 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:12.149 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:12.150 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:12.150 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:12.151 134138 DEBUG oslo_concurrency.lockutils [req-e0203f22-5009-4153-8711-afb71d3738ab a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:49:12.151 134138 DEBUG oslo_concurrency.lockutils [req-e0203f22-5009-4153-8711-afb71d3738ab a09f16937f634b5a85c8b00eb830d566 50feff4feb574037aefd0bf043ed97fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:49:12.152 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:12.153 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:12.153 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:13.141 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:13.141 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:13.141 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:13.142 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:13.142 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:13.143 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:13.152 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:13.152 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:13.152 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:13.154 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:13.154 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:13.154 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:15.143 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:15.143 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:15.144 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:15.145 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:15.145 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:15.146 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:15.154 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:15.154 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:15.154 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:15.157 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:15.157 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:15.157 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:19.069 134146 DEBUG oslo_service.periodic_task [req-f6512447-05bc-4c7b-8912-33725562e2c2 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:49:19.074 134146 DEBUG oslo_concurrency.lockutils [req-9a13ac3c-1f00-4368-be43-fcd34761e394 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:49:19.074 134146 DEBUG oslo_concurrency.lockutils [req-9a13ac3c-1f00-4368-be43-fcd34761e394 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:49:19.146 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:19.146 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:19.147 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:19.147 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:19.147 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:19.147 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:19.157 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:19.158 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:19.158 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:19.162 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:19.162 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:19.162 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:20.074 134138 DEBUG oslo_service.periodic_task [req-644cce0e-dadb-435c-b969-e25096dc502e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:49:20.078 134138 DEBUG oslo_concurrency.lockutils [req-0a97295a-a68a-4296-9907-cb7c488347c0 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:49:20.079 134138 DEBUG oslo_concurrency.lockutils [req-0a97295a-a68a-4296-9907-cb7c488347c0 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:49:25.052 134140 DEBUG oslo_service.periodic_task [req-84fd97cf-b1d7-4f0b-8e2f-b129ed946e66 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:49:25.060 134140 DEBUG oslo_concurrency.lockutils [req-bf4fa6f7-00a9-4f83-a829-7e8c6384797e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:49:25.060 134140 DEBUG oslo_concurrency.lockutils [req-bf4fa6f7-00a9-4f83-a829-7e8c6384797e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:49:27.148 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:27.149 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:27.149 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:27.149 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:27.149 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:27.149 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:27.160 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:27.161 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:27.161 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:27.163 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:27.163 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:27.164 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:27.573 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: 383be72440344a039049325109b24f3f reply to reply_196060a39108420a873d953cb9a1134e __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320
2026-04-23 01:49:27.573 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:27.573 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f2d18af8d8634f92848526f00c8d6fbc poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:49:27.574 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:27.574 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:27.576 134145 DEBUG nova.scheduler.manager [req-6b234bb2-1a12-4fb7-8298-f872f74f3dec 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Starting to schedule for instances: ['2db7444a-2614-4cdd-933f-af80442ab291'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141
2026-04-23 01:49:27.578 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:27.578 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:27.578 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:27.583 134145 DEBUG nova.scheduler.request_filter [req-6b234bb2-1a12-4fb7-8298-f872f74f3dec 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-23 01:49:27.583 134145 DEBUG nova.scheduler.request_filter [req-6b234bb2-1a12-4fb7-8298-f872f74f3dec 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254
2026-04-23 01:49:27.583 134145 DEBUG nova.scheduler.request_filter [req-6b234bb2-1a12-4fb7-8298-f872f74f3dec 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-23 01:49:27.584 134145 DEBUG nova.scheduler.request_filter [req-6b234bb2-1a12-4fb7-8298-f872f74f3dec 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-23 01:49:27.585 134145 DEBUG nova.scheduler.request_filter [req-6b234bb2-1a12-4fb7-8298-f872f74f3dec 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-23 01:49:27.591 134145 DEBUG oslo_concurrency.lockutils [req-6b234bb2-1a12-4fb7-8298-f872f74f3dec 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:49:27.592 134145 DEBUG oslo_concurrency.lockutils [req-6b234bb2-1a12-4fb7-8298-f872f74f3dec 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:49:27.658 134145 DEBUG oslo_messaging._drivers.amqpdriver [req-6b234bb2-1a12-4fb7-8298-f872f74f3dec 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: 5ea0fc15e920456198a62a566525b7aa NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656
2026-04-23 01:49:27.660 134145 DEBUG oslo_concurrency.lockutils [req-6b234bb2-1a12-4fb7-8298-f872f74f3dec 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:49:27.661 134145 DEBUG oslo_concurrency.lockutils [req-6b234bb2-1a12-4fb7-8298-f872f74f3dec 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:49:27.674 134145 DEBUG oslo_concurrency.lockutils [req-6b234bb2-1a12-4fb7-8298-f872f74f3dec 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:49:27.675 134145 DEBUG nova.scheduler.host_manager [req-6b234bb2-1a12-4fb7-8298-f872f74f3dec 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["fpu", "apic", "pku", "pge", "pse", "fsgsbase", "rtm", "mmx", "avx512bw", "cx16", "avx2", "cmov", "xsavec", "clwb", "smap", "clflushopt", "avx512-vpopcntdq", "pat", "lm", "pclmuldq", "msr", "sse4.2", "ssse3", "arat", "adx", "sse2", "fma", "aes", "rdtscp", "avx512vl", "xsaveopt", "3dnowprefetch", "vme", "hle", "sse4.1", "syscall", "vpclmulqdq", "cx8", "lahf_lm", "clflush", "avx512cd", "tsc", "avx512dq", "gfni", "pse36", "wbnoinvd", "sep", "nx", "avx512vbmi", "popcnt", "spec-ctrl", "vaes", "pae", "fxsr", "ssbd", "la57", "avx512f", "hypervisor", "mca", "abm", "avx512vbmi2", "mtrr", "movbe", "pni", "rdseed", "xgetbv1", "pcid", "invpcid", "erms", "ht", "avx512bitalg", "mce", "umip", "de", "rdrand", "pdpe1gb", "smep", "xsave", "bmi1", "bmi2", "avx", "avx512vnni", "x2apic", "tsc-deadline", "f16c", "sse"]}',created_at=2026-04-23T00:42:02Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=24,free_disk_gb=77,free_ram_mb=31387,host='cn-jenkins-deploy-platform-juju-os-697-1',host_ip=10.0.0.129,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-697-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=0,mapped=1,memory_mb=31899,memory_mb_used=512,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035207, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["id", "cpu_usage", "memory_usage", "pinned_cpus", "network_metadata", "siblings", "memory", "socket", "pcpuset", "mempages", "cpuset"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=0,service_id=None,stats={failed_builds='0',io_workload='0',num_instances='0',num_os_type_None='0',num_proj_50feff4feb574037aefd0bf043ed97fb='0',num_proj_bbe543f6c8ad405a838c992bdd870466='0',num_task_None='0',num_vm_active='0',num_vm_building='0'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-23T01:49:12Z,uuid=aa32474e-00f6-4c5b-a2ec-d12a3f31dd05,vcpus=8,vcpus_used=0) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167
2026-04-23 01:49:27.676 134145 DEBUG nova.scheduler.host_manager [req-6b234bb2-1a12-4fb7-8298-f872f74f3dec 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170
2026-04-23 01:49:27.676 134145 DEBUG nova.scheduler.host_manager [req-6b234bb2-1a12-4fb7-8298-f872f74f3dec 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with service dict: {'id': 9, 'uuid': '979fb67d-7796-4f44-89f1-b2234e3e1381', 'host': 'cn-jenkins-deploy-platform-juju-os-697-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 402, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 23, 1, 49, 22, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 23, 0, 42, 2, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 23, 1, 49, 22, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173
2026-04-23 01:49:27.676 134145 DEBUG nova.scheduler.host_manager [req-6b234bb2-1a12-4fb7-8298-f872f74f3dec 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176
2026-04-23 01:49:27.676 134145 DEBUG oslo_concurrency.lockutils [req-6b234bb2-1a12-4fb7-8298-f872f74f3dec 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:49:27.677 134145 INFO nova.scheduler.host_manager [req-6b234bb2-1a12-4fb7-8298-f872f74f3dec 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-697-1
2026-04-23 01:49:27.677 134145 DEBUG nova.scheduler.manager [req-6b234bb2-1a12-4fb7-8298-f872f74f3dec 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 24576MB io_ops: 0 instances: 0]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610
2026-04-23 01:49:27.678 134145 DEBUG nova.scheduler.manager [req-6b234bb2-1a12-4fb7-8298-f872f74f3dec 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 24576MB io_ops: 0 instances: 0, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632
2026-04-23 01:49:27.678 134145 DEBUG nova.scheduler.utils [req-6b234bb2-1a12-4fb7-8298-f872f74f3dec 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Attempting to claim resources in the placement API for instance 2db7444a-2614-4cdd-933f-af80442ab291 claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253
2026-04-23 01:49:27.793 134145 DEBUG nova.scheduler.manager [req-6b234bb2-1a12-4fb7-8298-f872f74f3dec 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] [instance: 2db7444a-2614-4cdd-933f-af80442ab291] Selected host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 24576MB io_ops: 0 instances: 0 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513
2026-04-23 01:49:27.794 134145 DEBUG oslo_concurrency.lockutils [req-6b234bb2-1a12-4fb7-8298-f872f74f3dec 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:49:27.794 134145 DEBUG oslo_concurrency.lockutils [req-6b234bb2-1a12-4fb7-8298-f872f74f3dec 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:49:27.796 134145 DEBUG oslo_messaging._drivers.amqpdriver [req-6b234bb2-1a12-4fb7-8298-f872f74f3dec 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: 9bad9d7ff58d4bb3889298ac51a75d4c NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656
2026-04-23 01:49:27.797 134145 DEBUG oslo_messaging._drivers.amqpdriver [req-6b234bb2-1a12-4fb7-8298-f872f74f3dec 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] sending reply msg_id: 383be72440344a039049325109b24f3f reply queue: reply_196060a39108420a873d953cb9a1134e time elapsed: 0.22396290300002875s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118
2026-04-23 01:49:28.581 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:28.581 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:28.581 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:29.281 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: 6899c0b085b94fb9a1a1924a0b91ffb2 reply to reply_85578e9317f24a84af7e369b8d783622 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320
2026-04-23 01:49:29.281 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:29.281 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: a61ff166f562494fbac4b6a5a7941daa poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:49:29.282 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:29.282 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:29.283 134140 DEBUG nova.scheduler.manager [req-96b7f823-a537-4849-8c03-b6e978d9e1e8 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Starting to schedule for instances: ['df5c20da-e898-42cf-9486-8170f23c3f37'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141
2026-04-23 01:49:29.285 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:29.286 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:29.286 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:29.290 134140 DEBUG nova.scheduler.request_filter [req-96b7f823-a537-4849-8c03-b6e978d9e1e8 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-23 01:49:29.290 134140 DEBUG nova.scheduler.request_filter [req-96b7f823-a537-4849-8c03-b6e978d9e1e8 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254
2026-04-23 01:49:29.291 134140 DEBUG nova.scheduler.request_filter [req-96b7f823-a537-4849-8c03-b6e978d9e1e8 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-23 01:49:29.291 134140 DEBUG nova.scheduler.request_filter [req-96b7f823-a537-4849-8c03-b6e978d9e1e8 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-23 01:49:29.291 134140 DEBUG nova.scheduler.request_filter [req-96b7f823-a537-4849-8c03-b6e978d9e1e8 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-23 01:49:29.296 134140 DEBUG oslo_concurrency.lockutils [req-96b7f823-a537-4849-8c03-b6e978d9e1e8 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:49:29.296 134140 DEBUG oslo_concurrency.lockutils [req-96b7f823-a537-4849-8c03-b6e978d9e1e8 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:49:29.436 134140 DEBUG oslo_messaging._drivers.amqpdriver [req-96b7f823-a537-4849-8c03-b6e978d9e1e8 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: 48bb88389f2a451194eeafa9be47ff41 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656
2026-04-23 01:49:29.438 134140 DEBUG oslo_concurrency.lockutils [req-96b7f823-a537-4849-8c03-b6e978d9e1e8 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:49:29.438 134140 DEBUG oslo_concurrency.lockutils [req-96b7f823-a537-4849-8c03-b6e978d9e1e8 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:49:29.451 134140 DEBUG oslo_concurrency.lockutils [req-96b7f823-a537-4849-8c03-b6e978d9e1e8 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock
"('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:49:29.451 134140 DEBUG nova.scheduler.host_manager [req-96b7f823-a537-4849-8c03-b6e978d9e1e8 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["fpu", "apic", "pku", "pge", "pse", "fsgsbase", "rtm", "mmx", "avx512bw", "cx16", "avx2", "cmov", "xsavec", "clwb", "smap", "clflushopt", "avx512-vpopcntdq", "pat", "lm", "pclmuldq", "msr", "sse4.2", "ssse3", "arat", "adx", "sse2", "fma", "aes", "rdtscp", "avx512vl", "xsaveopt", "3dnowprefetch", "vme", "hle", "sse4.1", "syscall", "vpclmulqdq", "cx8", "lahf_lm", "clflush", "avx512cd", "tsc", "avx512dq", "gfni", "pse36", "wbnoinvd", "sep", "nx", "avx512vbmi", "popcnt", "spec-ctrl", "vaes", "pae", "fxsr", "ssbd", "la57", "avx512f", "hypervisor", "mca", "abm", "avx512vbmi2", "mtrr", "movbe", "pni", "rdseed", "xgetbv1", "pcid", "invpcid", "erms", "ht", "avx512bitalg", "mce", "umip", "de", "rdrand", "pdpe1gb", "smep", "xsave", "bmi1", "bmi2", "avx", "avx512vnni", "x2apic", "tsc-deadline", "f16c", 
"sse"]}',created_at=2026-04-23T00:42:02Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=26,free_disk_gb=76,free_ram_mb=30363,host='cn-jenkins-deploy-platform-juju-os-697-1',host_ip=10.0.0.129,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-697-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=1,mapped=1,memory_mb=31899,memory_mb_used=1536,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035207, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["id", "cpu_usage", "memory_usage", "pinned_cpus", 
"network_metadata", "siblings", "memory", "socket", "pcpuset", "mempages", "cpuset"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=1,service_id=None,stats={failed_builds='0',io_workload='1',num_instances='1',num_os_type_None='1',num_proj_e4a006b39b3d4bbfad8d946d54c83f16='1',num_task_None='1',num_vm_building='1'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-23T01:49:28Z,uuid=aa32474e-00f6-4c5b-a2ec-d12a3f31dd05,vcpus=8,vcpus_used=1) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-23 01:49:29.452 134140 DEBUG nova.scheduler.host_manager [req-96b7f823-a537-4849-8c03-b6e978d9e1e8 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-23 01:49:29.452 134140 DEBUG nova.scheduler.host_manager [req-96b7f823-a537-4849-8c03-b6e978d9e1e8 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with service dict: {'id': 9, 'uuid': '979fb67d-7796-4f44-89f1-b2234e3e1381', 'host': 'cn-jenkins-deploy-platform-juju-os-697-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 402, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 23, 1, 49, 22, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 23, 0, 42, 2, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 23, 1, 49, 22, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update 
/usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-23 01:49:29.453 134140 DEBUG nova.scheduler.host_manager [req-96b7f823-a537-4849-8c03-b6e978d9e1e8 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-23 01:49:29.453 134140 DEBUG oslo_concurrency.lockutils [req-96b7f823-a537-4849-8c03-b6e978d9e1e8 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:49:29.453 134140 INFO nova.scheduler.host_manager [req-96b7f823-a537-4849-8c03-b6e978d9e1e8 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-697-1 2026-04-23 01:49:29.453 134140 DEBUG nova.scheduler.manager [req-96b7f823-a537-4849-8c03-b6e978d9e1e8 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 30363MB disk: 26624MB io_ops: 1 instances: 1]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-23 01:49:29.454 134140 DEBUG nova.scheduler.manager [req-96b7f823-a537-4849-8c03-b6e978d9e1e8 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Weighed [WeighedHost [host: 
(cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 30363MB disk: 26624MB io_ops: 1 instances: 1, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-23 01:49:29.454 134140 DEBUG nova.scheduler.utils [req-96b7f823-a537-4849-8c03-b6e978d9e1e8 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Attempting to claim resources in the placement API for instance df5c20da-e898-42cf-9486-8170f23c3f37 claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-23 01:49:29.578 134140 DEBUG nova.scheduler.manager [req-96b7f823-a537-4849-8c03-b6e978d9e1e8 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] [instance: df5c20da-e898-42cf-9486-8170f23c3f37] Selected host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 30363MB disk: 26624MB io_ops: 1 instances: 1 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-23 01:49:29.578 134140 DEBUG oslo_concurrency.lockutils [req-96b7f823-a537-4849-8c03-b6e978d9e1e8 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:49:29.579 134140 DEBUG oslo_concurrency.lockutils [req-96b7f823-a537-4849-8c03-b6e978d9e1e8 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 
'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:49:29.580 134140 DEBUG oslo_messaging._drivers.amqpdriver [req-96b7f823-a537-4849-8c03-b6e978d9e1e8 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: b43df2b6b7a241aaa1c216b7f01efd07 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:49:29.582 134140 DEBUG oslo_messaging._drivers.amqpdriver [req-96b7f823-a537-4849-8c03-b6e978d9e1e8 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] sending reply msg_id: 6899c0b085b94fb9a1a1924a0b91ffb2 reply queue: reply_85578e9317f24a84af7e369b8d783622 time elapsed: 0.3009193090001645s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-23 01:49:30.288 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:30.288 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:30.288 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:30.583 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:30.584 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:30.584 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:32.290 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:32.291 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:32.291 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:32.304 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 5f132e4a52524f05b21e631e5031c1ad __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:49:32.304 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 5f132e4a52524f05b21e631e5031c1ad __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:49:32.304 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 5f132e4a52524f05b21e631e5031c1ad __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:49:32.305 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:32.305 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:32.305 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 5f132e4a52524f05b21e631e5031c1ad poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:49:32.305 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 5f132e4a52524f05b21e631e5031c1ad poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:49:32.305 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:32.305 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:32.305 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:32.305 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 5f132e4a52524f05b21e631e5031c1ad poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:49:32.305 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:32.305 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:32.305 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:32.305 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:32.308 134140 DEBUG oslo_concurrency.lockutils [req-6b234bb2-1a12-4fb7-8298-f872f74f3dec 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 
463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:49:32.308 134146 DEBUG oslo_concurrency.lockutils [req-6b234bb2-1a12-4fb7-8298-f872f74f3dec 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:49:32.308 134138 DEBUG oslo_concurrency.lockutils [req-6b234bb2-1a12-4fb7-8298-f872f74f3dec 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:49:32.308 134140 DEBUG oslo_concurrency.lockutils [req-6b234bb2-1a12-4fb7-8298-f872f74f3dec 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:49:32.308 134146 DEBUG oslo_concurrency.lockutils [req-6b234bb2-1a12-4fb7-8298-f872f74f3dec 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:49:32.308 134138 DEBUG oslo_concurrency.lockutils 
[req-6b234bb2-1a12-4fb7-8298-f872f74f3dec 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:49:32.308 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:32.308 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:32.309 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:32.309 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:32.309 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:32.309 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:32.309 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:32.309 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:32.309 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 
2026-04-23 01:49:32.311 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 5f132e4a52524f05b21e631e5031c1ad __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:49:32.312 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:32.313 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 5f132e4a52524f05b21e631e5031c1ad poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:49:32.313 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:32.313 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:32.315 134145 DEBUG oslo_concurrency.lockutils [req-6b234bb2-1a12-4fb7-8298-f872f74f3dec 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:49:32.315 134145 DEBUG oslo_concurrency.lockutils [req-6b234bb2-1a12-4fb7-8298-f872f74f3dec 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:49:32.317 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:32.317 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:32.317 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:33.150 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: d54a7f7d92b54ad4bbaed49b5be89ea2 reply to reply_196060a39108420a873d953cb9a1134e __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-23 01:49:33.150 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:33.150 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 66cf73b852554ff28da710c5fa0ec076 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:49:33.150 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:33.150 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:33.154 134146 DEBUG nova.scheduler.manager [req-abb7055f-915a-482e-b9fc-ca9d2500e54c 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Starting to schedule for instances: ['edd42326-7018-4e8c-b78d-532f13cc9517'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-23 01:49:33.158 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:33.159 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:33.159 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:33.165 134146 DEBUG nova.scheduler.request_filter [req-abb7055f-915a-482e-b9fc-ca9d2500e54c 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:49:33.166 134146 DEBUG nova.scheduler.request_filter [req-abb7055f-915a-482e-b9fc-ca9d2500e54c 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-23 01:49:33.166 134146 DEBUG nova.scheduler.request_filter [req-abb7055f-915a-482e-b9fc-ca9d2500e54c 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:49:33.166 134146 DEBUG nova.scheduler.request_filter [req-abb7055f-915a-482e-b9fc-ca9d2500e54c 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 
01:49:33.167 134146 DEBUG nova.scheduler.request_filter [req-abb7055f-915a-482e-b9fc-ca9d2500e54c 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:49:33.172 134146 DEBUG oslo_concurrency.lockutils [req-abb7055f-915a-482e-b9fc-ca9d2500e54c 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:49:33.172 134146 DEBUG oslo_concurrency.lockutils [req-abb7055f-915a-482e-b9fc-ca9d2500e54c 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:49:33.237 134146 DEBUG oslo_messaging._drivers.amqpdriver [req-abb7055f-915a-482e-b9fc-ca9d2500e54c 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: 051a17c059774b45929d567125d2432a NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:49:33.239 134146 DEBUG oslo_concurrency.lockutils [req-abb7055f-915a-482e-b9fc-ca9d2500e54c 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock 
"de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:49:33.239 134146 DEBUG oslo_concurrency.lockutils [req-abb7055f-915a-482e-b9fc-ca9d2500e54c 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:49:33.248 134146 DEBUG oslo_concurrency.lockutils [req-abb7055f-915a-482e-b9fc-ca9d2500e54c 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:49:33.248 134146 DEBUG nova.scheduler.host_manager [req-abb7055f-915a-482e-b9fc-ca9d2500e54c 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["fpu", "apic", "pku", "pge", "pse", "fsgsbase", "rtm", "mmx", "avx512bw", "cx16", "avx2", "cmov", "xsavec", "clwb", "smap", "clflushopt", "avx512-vpopcntdq", "pat", "lm", "pclmuldq", "msr", "sse4.2", "ssse3", "arat", "adx", "sse2", "fma", "aes", "rdtscp", "avx512vl", "xsaveopt", "3dnowprefetch", "vme", "hle", "sse4.1", "syscall", "vpclmulqdq", "cx8", 
"lahf_lm", "clflush", "avx512cd", "tsc", "avx512dq", "gfni", "pse36", "wbnoinvd", "sep", "nx", "avx512vbmi", "popcnt", "spec-ctrl", "vaes", "pae", "fxsr", "ssbd", "la57", "avx512f", "hypervisor", "mca", "abm", "avx512vbmi2", "mtrr", "movbe", "pni", "rdseed", "xgetbv1", "pcid", "invpcid", "erms", "ht", "avx512bitalg", "mce", "umip", "de", "rdrand", "pdpe1gb", "smep", "xsave", "bmi1", "bmi2", "avx", "avx512vnni", "x2apic", "tsc-deadline", "f16c", "sse"]}',created_at=2026-04-23T00:42:02Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=26,free_disk_gb=75,free_ram_mb=29339,host='cn-jenkins-deploy-platform-juju-os-697-1',host_ip=10.0.0.129,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-697-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=2,mapped=1,memory_mb=31899,memory_mb_used=2560,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035207, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", 
"nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["id", "cpu_usage", "memory_usage", "pinned_cpus", "network_metadata", "siblings", "memory", "socket", "pcpuset", "mempages", "cpuset"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=2,service_id=None,stats={failed_builds='0',io_workload='2',num_instances='2',num_os_type_None='2',num_proj_e4a006b39b3d4bbfad8d946d54c83f16='2',num_task_None='2',num_vm_building='2'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-23T01:49:30Z,uuid=aa32474e-00f6-4c5b-a2ec-d12a3f31dd05,vcpus=8,vcpus_used=2) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-23 01:49:33.249 134146 DEBUG nova.scheduler.host_manager [req-abb7055f-915a-482e-b9fc-ca9d2500e54c 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-23 01:49:33.249 134146 DEBUG nova.scheduler.host_manager [req-abb7055f-915a-482e-b9fc-ca9d2500e54c 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with service dict: {'id': 9, 'uuid': '979fb67d-7796-4f44-89f1-b2234e3e1381', 'host': 'cn-jenkins-deploy-platform-juju-os-697-1', 'binary': 'nova-compute', 'topic': 'compute', 
'report_count': 403, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 23, 1, 49, 32, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 23, 0, 42, 2, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 23, 1, 49, 32, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-23 01:49:33.249 134146 DEBUG nova.scheduler.host_manager [req-abb7055f-915a-482e-b9fc-ca9d2500e54c 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with instances: ['2db7444a-2614-4cdd-933f-af80442ab291'] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-23 01:49:33.250 134146 DEBUG oslo_concurrency.lockutils [req-abb7055f-915a-482e-b9fc-ca9d2500e54c 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.update.<locals>._locked_update" :: held 0.003s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:49:33.251 134146 INFO nova.scheduler.host_manager [req-abb7055f-915a-482e-b9fc-ca9d2500e54c 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-697-1 2026-04-23 01:49:33.251 134146 DEBUG nova.scheduler.manager [req-abb7055f-915a-482e-b9fc-ca9d2500e54c 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Filtered 
dict_values([(cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 29339MB disk: 26624MB io_ops: 2 instances: 2]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-23 01:49:33.251 134146 DEBUG nova.scheduler.manager [req-abb7055f-915a-482e-b9fc-ca9d2500e54c 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 29339MB disk: 26624MB io_ops: 2 instances: 2, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-23 01:49:33.251 134146 DEBUG nova.scheduler.utils [req-abb7055f-915a-482e-b9fc-ca9d2500e54c 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Attempting to claim resources in the placement API for instance edd42326-7018-4e8c-b78d-532f13cc9517 claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-23 01:49:33.276 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 0a9efd142c4d48c0b583c08cbcd831fe __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:49:33.276 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:33.276 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 0a9efd142c4d48c0b583c08cbcd831fe __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:49:33.276 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 0a9efd142c4d48c0b583c08cbcd831fe poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 
01:49:33.276 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:33.276 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:33.277 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:33.277 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 0a9efd142c4d48c0b583c08cbcd831fe poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:49:33.277 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:33.277 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:33.279 134146 DEBUG oslo_concurrency.lockutils [req-96b7f823-a537-4849-8c03-b6e978d9e1e8 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:49:33.281 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 0a9efd142c4d48c0b583c08cbcd831fe __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:49:33.290 134146 DEBUG oslo_concurrency.lockutils [req-96b7f823-a537-4849-8c03-b6e978d9e1e8 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 
463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.011s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:49:33.290 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:33.282 134145 DEBUG oslo_concurrency.lockutils [req-96b7f823-a537-4849-8c03-b6e978d9e1e8 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:49:33.282 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 0a9efd142c4d48c0b583c08cbcd831fe __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:49:33.291 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:33.292 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 0a9efd142c4d48c0b583c08cbcd831fe poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:49:33.292 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:33.292 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:33.292 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 
2026-04-23 01:49:33.293 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:33.293 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 0a9efd142c4d48c0b583c08cbcd831fe poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:49:33.293 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:33.293 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:33.293 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:33.296 134138 DEBUG oslo_concurrency.lockutils [req-96b7f823-a537-4849-8c03-b6e978d9e1e8 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:49:33.297 134138 DEBUG oslo_concurrency.lockutils [req-96b7f823-a537-4849-8c03-b6e978d9e1e8 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:49:33.298 134140 DEBUG oslo_concurrency.lockutils [req-96b7f823-a537-4849-8c03-b6e978d9e1e8 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 
463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:49:33.297 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:33.301 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:33.303 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:33.301 134140 DEBUG oslo_concurrency.lockutils [req-96b7f823-a537-4849-8c03-b6e978d9e1e8 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.003s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:49:33.306 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:33.307 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:33.307 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:33.301 134145 DEBUG oslo_concurrency.lockutils [req-96b7f823-a537-4849-8c03-b6e978d9e1e8 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 
463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.020s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:49:33.308 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:33.310 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:33.310 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:33.333 134146 DEBUG nova.scheduler.manager [req-abb7055f-915a-482e-b9fc-ca9d2500e54c 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] [instance: edd42326-7018-4e8c-b78d-532f13cc9517] Selected host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 29339MB disk: 26624MB io_ops: 2 instances: 2 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-23 01:49:33.334 134146 DEBUG oslo_concurrency.lockutils [req-abb7055f-915a-482e-b9fc-ca9d2500e54c 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.<locals>._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:49:33.334 134146 DEBUG oslo_concurrency.lockutils [req-abb7055f-915a-482e-b9fc-ca9d2500e54c 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 
463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.<locals>._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:49:33.335 134146 DEBUG oslo_messaging._drivers.amqpdriver [req-abb7055f-915a-482e-b9fc-ca9d2500e54c 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: cb20472a952f4e5ab9c2cd7c76fc987d NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:49:33.338 134146 DEBUG oslo_messaging._drivers.amqpdriver [req-abb7055f-915a-482e-b9fc-ca9d2500e54c 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] sending reply msg_id: d54a7f7d92b54ad4bbaed49b5be89ea2 reply queue: reply_196060a39108420a873d953cb9a1134e time elapsed: 0.1885281249997206s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-23 01:49:34.295 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:34.295 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:34.295 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:34.307 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:34.308 134138 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:34.308 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:34.308 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:34.308 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:34.308 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:34.313 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:34.313 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:34.314 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:36.297 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:36.297 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:36.297 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:36.309 134140 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:36.309 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:36.310 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:36.310 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:36.310 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:36.310 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:36.315 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:36.316 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:36.316 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:37.062 134145 DEBUG oslo_service.periodic_task [req-ce642657-7c65-48c2-ae26-c9ac1989f2a4 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:49:37.066 134145 DEBUG oslo_concurrency.lockutils [req-f33488af-92ac-458e-9075-af5e8ba8acc3 - - - - -] Lock 
"de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:49:37.066 134145 DEBUG oslo_concurrency.lockutils [req-f33488af-92ac-458e-9075-af5e8ba8acc3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:49:39.369 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: dd8a37a3b4684a50af5095af832a015b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:49:39.369 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: dd8a37a3b4684a50af5095af832a015b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:49:39.369 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: dd8a37a3b4684a50af5095af832a015b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:49:39.370 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:39.370 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:39.370 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: dd8a37a3b4684a50af5095af832a015b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:49:39.370 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:39.370 
134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: dd8a37a3b4684a50af5095af832a015b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:49:39.370 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: dd8a37a3b4684a50af5095af832a015b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:49:39.370 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:39.370 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:39.370 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:39.370 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:39.370 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:39.370 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:39.372 134146 DEBUG oslo_concurrency.lockutils [req-abb7055f-915a-482e-b9fc-ca9d2500e54c 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:49:39.372 134140 DEBUG 
oslo_concurrency.lockutils [req-abb7055f-915a-482e-b9fc-ca9d2500e54c 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:49:39.372 134146 DEBUG oslo_concurrency.lockutils [req-abb7055f-915a-482e-b9fc-ca9d2500e54c 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:49:39.372 134138 DEBUG oslo_concurrency.lockutils [req-abb7055f-915a-482e-b9fc-ca9d2500e54c 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:49:39.373 134138 DEBUG oslo_concurrency.lockutils [req-abb7055f-915a-482e-b9fc-ca9d2500e54c 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:49:39.373 134140 DEBUG oslo_concurrency.lockutils [req-abb7055f-915a-482e-b9fc-ca9d2500e54c 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by 
"nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:49:39.373 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:39.374 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:39.374 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:39.374 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: dd8a37a3b4684a50af5095af832a015b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:49:39.373 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:39.373 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:49:39.375 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:39.375 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:49:39.375 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:49:39.375 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 
2026-04-23 01:49:39.376 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:39.376 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: dd8a37a3b4684a50af5095af832a015b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:49:39.376 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:39.377 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:39.380 134145 DEBUG oslo_concurrency.lockutils [req-abb7055f-915a-482e-b9fc-ca9d2500e54c 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:49:39.381 134145 DEBUG oslo_concurrency.lockutils [req-abb7055f-915a-482e-b9fc-ca9d2500e54c 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:49:39.381 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:39.381 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:39.381 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:40.376 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:40.376 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:40.377 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:40.377 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:40.377 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:40.378 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:40.378 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:40.378 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:40.378 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:40.383 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:40.383 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:40.383 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:42.378 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:42.378 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:42.379 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:42.379 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:42.380 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:42.380 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:42.380 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:42.381 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:42.381 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:42.385 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:42.385 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:42.385 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:46.380 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:46.381 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:46.381 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:46.382 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:46.382 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:46.382 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:46.382 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:46.383 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:46.383 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:46.386 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:46.387 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:46.387 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:50.069 134146 DEBUG oslo_service.periodic_task [req-9a13ac3c-1f00-4368-be43-fcd34761e394 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:49:50.073 134146 DEBUG oslo_concurrency.lockutils [req-cfb7edb1-8824-4c2b-b7e8-6b03f8d83732 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:49:50.073 134146 DEBUG oslo_concurrency.lockutils [req-cfb7edb1-8824-4c2b-b7e8-6b03f8d83732 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:49:50.084 134138 DEBUG oslo_service.periodic_task [req-0a97295a-a68a-4296-9907-cb7c488347c0 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:49:50.088 134138 DEBUG oslo_concurrency.lockutils [req-b0001570-dd9f-47c9-8738-f8c25d9908d5 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:49:50.088 134138 DEBUG oslo_concurrency.lockutils [req-b0001570-dd9f-47c9-8738-f8c25d9908d5 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:49:54.384 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:54.384 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:54.384 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:54.384 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:54.384 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:54.384 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:54.391 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:54.391 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:54.391 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:54.394 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:49:54.396 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:49:54.396 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:49:56.052 134140 DEBUG oslo_service.periodic_task [req-bf4fa6f7-00a9-4f83-a829-7e8c6384797e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:49:56.056 134140 DEBUG oslo_concurrency.lockutils [req-9a835894-ff19-42af-9e29-1864b2466db6 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:49:56.056 134140 DEBUG oslo_concurrency.lockutils [req-9a835894-ff19-42af-9e29-1864b2466db6 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:50:07.071 134145 DEBUG oslo_service.periodic_task [req-f33488af-92ac-458e-9075-af5e8ba8acc3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:50:07.075 134145 DEBUG oslo_concurrency.lockutils [req-833fd3fd-cd86-45c1-8486-72ea6c188ec3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:50:07.076 134145 DEBUG oslo_concurrency.lockutils [req-833fd3fd-cd86-45c1-8486-72ea6c188ec3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:50:10.386 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:10.386 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:10.386 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:10.386 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:10.386 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:10.386 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:10.393 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:10.393 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:10.393 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:10.398 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:10.398 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:10.398 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:20.079 134146 DEBUG oslo_service.periodic_task [req-cfb7edb1-8824-4c2b-b7e8-6b03f8d83732 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:50:20.083 134146 DEBUG oslo_concurrency.lockutils [req-0e55599d-664a-446d-bc57-e3307688eb81 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:50:20.083 134146 DEBUG oslo_concurrency.lockutils [req-0e55599d-664a-446d-bc57-e3307688eb81 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:50:20.094 134138 DEBUG oslo_service.periodic_task [req-b0001570-dd9f-47c9-8738-f8c25d9908d5 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:50:20.099 134138 DEBUG oslo_concurrency.lockutils [req-dcc1f531-4b84-45d1-a1f8-7e7e625a877f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:50:20.099 134138 DEBUG oslo_concurrency.lockutils [req-dcc1f531-4b84-45d1-a1f8-7e7e625a877f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:50:27.052 134140 DEBUG oslo_service.periodic_task [req-9a835894-ff19-42af-9e29-1864b2466db6 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:50:27.056 134140 DEBUG oslo_concurrency.lockutils [req-c3183001-f810-447d-b886-b598e20d99a5 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:50:27.056 134140 DEBUG oslo_concurrency.lockutils [req-c3183001-f810-447d-b886-b598e20d99a5 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:50:34.447 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1d727367bfef4242a596040436300f42 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:50:34.447 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1d727367bfef4242a596040436300f42 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:50:34.447 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1d727367bfef4242a596040436300f42 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:50:34.448 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:34.448 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:34.448 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:34.448 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1d727367bfef4242a596040436300f42 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:50:34.448 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1d727367bfef4242a596040436300f42 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:50:34.448 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1d727367bfef4242a596040436300f42 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:50:34.448 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:34.448 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:34.448 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:34.448 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:34.448 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:34.448 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:34.449 134140 DEBUG oslo_concurrency.lockutils [req-759985ba-490e-4811-a47c-481d1521b662 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:50:34.449 134145 DEBUG oslo_concurrency.lockutils [req-759985ba-490e-4811-a47c-481d1521b662 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:50:34.449 134138 DEBUG oslo_concurrency.lockutils [req-759985ba-490e-4811-a47c-481d1521b662 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:50:34.449 134140 DEBUG nova.scheduler.host_manager [req-759985ba-490e-4811-a47c-481d1521b662 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:50:34.449 134145 DEBUG nova.scheduler.host_manager [req-759985ba-490e-4811-a47c-481d1521b662 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:50:34.449 134138 DEBUG nova.scheduler.host_manager [req-759985ba-490e-4811-a47c-481d1521b662 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:50:34.449 134140 DEBUG oslo_concurrency.lockutils [req-759985ba-490e-4811-a47c-481d1521b662 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:50:34.449 134145 DEBUG oslo_concurrency.lockutils [req-759985ba-490e-4811-a47c-481d1521b662 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:50:34.449 134138 DEBUG oslo_concurrency.lockutils [req-759985ba-490e-4811-a47c-481d1521b662 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:50:34.450 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:34.450 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:34.450 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:34.450 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:34.450 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:34.450 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:34.451 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:34.451 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1d727367bfef4242a596040436300f42 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:50:34.451 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:34.451 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:34.451 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:34.451 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1d727367bfef4242a596040436300f42 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:50:34.452 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:34.452 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:34.452 134146 DEBUG oslo_concurrency.lockutils [req-759985ba-490e-4811-a47c-481d1521b662 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:50:34.453 134146 DEBUG nova.scheduler.host_manager [req-759985ba-490e-4811-a47c-481d1521b662 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:50:34.453 134146 DEBUG oslo_concurrency.lockutils [req-759985ba-490e-4811-a47c-481d1521b662 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:50:34.453 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:34.453 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:34.453 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:35.450 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:35.451 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:35.451 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:35.451 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:35.451 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:35.452 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:35.452 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:35.452 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:35.453 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:35.454 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:35.454 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:35.455 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:37.080 134145 DEBUG oslo_service.periodic_task [req-833fd3fd-cd86-45c1-8486-72ea6c188ec3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:50:37.084 134145 DEBUG oslo_concurrency.lockutils [req-02414845-fe83-48d9-a66f-33f152b17005 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:50:37.085 134145 DEBUG oslo_concurrency.lockutils [req-02414845-fe83-48d9-a66f-33f152b17005 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:50:37.453 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:37.453 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:37.453 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:37.454 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:37.454 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:37.454 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:37.454 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:37.454 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:37.454 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:37.457 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:37.457 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:37.458 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:41.457 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:41.457 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:41.457 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:41.458 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:41.458 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:41.458 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:41.458 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:41.459 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:41.459 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:41.462 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:41.462 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:41.462 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:43.609 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 0cdaaab2678243cb9a04b0598bad610a __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:50:43.609 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 0cdaaab2678243cb9a04b0598bad610a __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:50:43.609 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 0cdaaab2678243cb9a04b0598bad610a __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:50:43.610 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:43.610 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 0cdaaab2678243cb9a04b0598bad610a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:50:43.610 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:43.610 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 0cdaaab2678243cb9a04b0598bad610a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:50:43.610 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:43.610 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:43.610 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:43.610 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 0cdaaab2678243cb9a04b0598bad610a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:50:43.610 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:43.610 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:43.610 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:50:43.611 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:50:43.611 134138 DEBUG oslo_concurrency.lockutils [req-2cb5d7b3-2228-4cab-bf48-48a0af4f6f5f 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:50:43.611 134138 DEBUG oslo_concurrency.lockutils [req-2cb5d7b3-2228-4cab-bf48-48a0af4f6f5f 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:50:43.611 134145 DEBUG oslo_concurrency.lockutils [req-2cb5d7b3-2228-4cab-bf48-48a0af4f6f5f 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:50:43.611 134140 DEBUG oslo_concurrency.lockutils [req-2cb5d7b3-2228-4cab-bf48-48a0af4f6f5f 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:50:43.611 134145 DEBUG 
oslo_concurrency.lockutils [req-2cb5d7b3-2228-4cab-bf48-48a0af4f6f5f 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:50:43.612 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 0cdaaab2678243cb9a04b0598bad610a __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:50:43.612 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:50:43.613 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:50:43.613 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:50:43.613 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:50:43.613 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:50:43.613 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:50:43.613 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:50:43.613 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 
0cdaaab2678243cb9a04b0598bad610a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:50:43.613 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:50:43.613 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:50:43.614 134146 DEBUG oslo_concurrency.lockutils [req-2cb5d7b3-2228-4cab-bf48-48a0af4f6f5f 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:50:43.615 134146 DEBUG oslo_concurrency.lockutils [req-2cb5d7b3-2228-4cab-bf48-48a0af4f6f5f 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:50:43.611 134140 DEBUG oslo_concurrency.lockutils [req-2cb5d7b3-2228-4cab-bf48-48a0af4f6f5f 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:50:43.616 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:50:43.616 134140 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:50:43.616 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:50:43.616 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:50:43.616 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:50:43.617 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:50:43.683 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 80c531fc10c14b0fb02d544c68302964 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:50:43.683 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 80c531fc10c14b0fb02d544c68302964 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:50:43.683 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 80c531fc10c14b0fb02d544c68302964 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:50:43.684 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:50:43.684 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:50:43.684 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with 
unique_id: 80c531fc10c14b0fb02d544c68302964 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:50:43.684 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 80c531fc10c14b0fb02d544c68302964 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:50:43.684 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:50:43.684 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:50:43.684 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 80c531fc10c14b0fb02d544c68302964 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:50:43.684 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:50:43.684 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:50:43.684 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:50:43.684 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:50:43.685 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:50:43.685 134138 DEBUG oslo_concurrency.lockutils [req-469ae74f-eec3-4732-8de5-d7af70512b6e 3b633b8b284040b882116450d1d6a530 
e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:50:43.685 134140 DEBUG oslo_concurrency.lockutils [req-469ae74f-eec3-4732-8de5-d7af70512b6e 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:50:43.685 134138 DEBUG oslo_concurrency.lockutils [req-469ae74f-eec3-4732-8de5-d7af70512b6e 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:50:43.685 134140 DEBUG oslo_concurrency.lockutils [req-469ae74f-eec3-4732-8de5-d7af70512b6e 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:50:43.685 134145 DEBUG oslo_concurrency.lockutils [req-469ae74f-eec3-4732-8de5-d7af70512b6e 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 
01:50:43.686 134145 DEBUG oslo_concurrency.lockutils [req-469ae74f-eec3-4732-8de5-d7af70512b6e 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:50:43.686 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:50:43.686 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:50:43.686 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 80c531fc10c14b0fb02d544c68302964 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:50:43.686 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:50:43.687 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:50:43.687 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:50:43.687 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 80c531fc10c14b0fb02d544c68302964 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:50:43.687 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:50:43.687 134138 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:50:43.687 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:50:43.687 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:50:43.688 134146 DEBUG oslo_concurrency.lockutils [req-469ae74f-eec3-4732-8de5-d7af70512b6e 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:50:43.688 134146 DEBUG oslo_concurrency.lockutils [req-469ae74f-eec3-4732-8de5-d7af70512b6e 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:50:43.688 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:50:43.689 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:50:43.689 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:50:43.690 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener 
connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:50:43.690 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:50:43.690 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:50:44.241 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 2bc883887fc343e280d8be0ad41ab3d2 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:50:44.241 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 2bc883887fc343e280d8be0ad41ab3d2 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:50:44.241 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:50:44.241 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 2bc883887fc343e280d8be0ad41ab3d2 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:50:44.241 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 2bc883887fc343e280d8be0ad41ab3d2 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:50:44.241 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:50:44.241 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 2bc883887fc343e280d8be0ad41ab3d2 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:50:44.241 134146 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:50:44.241 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:50:44.242 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:50:44.242 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:50:44.242 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 2bc883887fc343e280d8be0ad41ab3d2 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:50:44.242 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:50:44.242 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:50:44.242 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:50:44.242 134146 DEBUG oslo_concurrency.lockutils [req-ae2ee380-0241-47f8-b254-0dce0e56a435 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:50:44.242 134140 DEBUG oslo_concurrency.lockutils [req-ae2ee380-0241-47f8-b254-0dce0e56a435 
3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:50:44.242 134146 DEBUG oslo_concurrency.lockutils [req-ae2ee380-0241-47f8-b254-0dce0e56a435 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:50:44.243 134140 DEBUG oslo_concurrency.lockutils [req-ae2ee380-0241-47f8-b254-0dce0e56a435 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:50:44.244 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:50:44.244 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:50:44.244 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:50:44.244 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:50:44.244 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running 
poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:50:44.244 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:50:44.245 134138 DEBUG oslo_concurrency.lockutils [req-ae2ee380-0241-47f8-b254-0dce0e56a435 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:50:44.245 134138 DEBUG oslo_concurrency.lockutils [req-ae2ee380-0241-47f8-b254-0dce0e56a435 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:50:44.246 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:50:44.247 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:50:44.247 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:50:44.248 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 2bc883887fc343e280d8be0ad41ab3d2 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:50:44.248 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:50:44.248 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 2bc883887fc343e280d8be0ad41ab3d2 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:50:44.248 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:50:44.249 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:50:44.250 134145 DEBUG oslo_concurrency.lockutils [req-ae2ee380-0241-47f8-b254-0dce0e56a435 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:50:44.250 134145 DEBUG oslo_concurrency.lockutils [req-ae2ee380-0241-47f8-b254-0dce0e56a435 3b633b8b284040b882116450d1d6a530 e4a006b39b3d4bbfad8d946d54c83f16 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:50:44.251 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:50:44.251 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:50:44.251 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:45.245 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:45.245 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:45.246 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:45.246 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:45.246 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:45.246 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:45.249 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:45.249 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:45.249 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:45.253 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:45.253 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:45.253 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:47.248 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:47.248 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:47.249 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:47.249 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:47.249 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:47.249 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:47.252 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:47.253 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:47.253 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:47.255 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:47.255 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:47.256 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:50.088 134146 DEBUG oslo_service.periodic_task [req-0e55599d-664a-446d-bc57-e3307688eb81 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:50:50.092 134146 DEBUG oslo_concurrency.lockutils [req-a5115e6b-448a-4b98-a438-4ea5bc311770 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:50:50.093 134146 DEBUG oslo_concurrency.lockutils [req-a5115e6b-448a-4b98-a438-4ea5bc311770 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:50:50.103 134138 DEBUG oslo_service.periodic_task [req-dcc1f531-4b84-45d1-a1f8-7e7e625a877f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:50:50.107 134138 DEBUG oslo_concurrency.lockutils [req-73925287-8978-4e66-b099-6d9c7c7a62d9 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:50:50.107 134138 DEBUG oslo_concurrency.lockutils [req-73925287-8978-4e66-b099-6d9c7c7a62d9 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:50:51.249 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:51.250 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:51.250 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:51.251 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:51.251 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:51.251 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:51.255 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:51.255 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:51.255 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:51.258 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:51.258 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:51.258 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:55.149 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: 85597c08a4df489a9c64fdeef83bb700 reply to reply_b20bc86560af44159724b4fea607f8c2 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320
2026-04-23 01:50:55.149 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:55.150 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: a3be33d3616941c682bf71738c5f0dbc poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:50:55.150 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:55.150 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:55.152 134138 DEBUG nova.scheduler.manager [req-c80368f8-d290-426a-9fa4-46433003d6ff 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Starting to schedule for instances: ['a04ac1f5-103c-4e54-b5fa-48c88bac3add'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141
2026-04-23 01:50:55.154 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:55.154 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:55.154 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:55.157 134138 DEBUG nova.scheduler.request_filter [req-c80368f8-d290-426a-9fa4-46433003d6ff 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-23 01:50:55.157 134138 DEBUG nova.scheduler.request_filter [req-c80368f8-d290-426a-9fa4-46433003d6ff 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254
2026-04-23 01:50:55.157 134138 DEBUG nova.scheduler.request_filter [req-c80368f8-d290-426a-9fa4-46433003d6ff 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-23 01:50:55.157 134138 DEBUG nova.scheduler.request_filter [req-c80368f8-d290-426a-9fa4-46433003d6ff 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-23 01:50:55.158 134138 DEBUG nova.scheduler.request_filter [req-c80368f8-d290-426a-9fa4-46433003d6ff 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-23 01:50:55.165 134138 DEBUG oslo_concurrency.lockutils [req-c80368f8-d290-426a-9fa4-46433003d6ff 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:50:55.166 134138 DEBUG oslo_concurrency.lockutils [req-c80368f8-d290-426a-9fa4-46433003d6ff 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:50:55.624 134138 DEBUG oslo_messaging._drivers.amqpdriver [req-c80368f8-d290-426a-9fa4-46433003d6ff 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: 24cb94d6f70a4252bba2ab4d5a407f74 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656
2026-04-23 01:50:55.626 134138 DEBUG oslo_concurrency.lockutils [req-c80368f8-d290-426a-9fa4-46433003d6ff 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:50:55.627 134138 DEBUG oslo_concurrency.lockutils [req-c80368f8-d290-426a-9fa4-46433003d6ff 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:50:55.636 134138 DEBUG oslo_concurrency.lockutils [req-c80368f8-d290-426a-9fa4-46433003d6ff 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:50:55.636 134138 DEBUG nova.scheduler.host_manager [req-c80368f8-d290-426a-9fa4-46433003d6ff 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["fpu", "apic", "pku", "pge", "pse", "fsgsbase", "rtm", "mmx", "avx512bw", "cx16", "avx2", "cmov", "xsavec", "clwb", "smap", "clflushopt", "avx512-vpopcntdq", "pat", "lm", "pclmuldq", "msr", "sse4.2", "ssse3", "arat", "adx", "sse2", "fma", "aes", "rdtscp", "avx512vl", "xsaveopt", "3dnowprefetch", "vme", "hle", "sse4.1", "syscall", "vpclmulqdq", "cx8", "lahf_lm", "clflush", "avx512cd", "tsc", "avx512dq", "gfni", "pse36", "wbnoinvd", "sep", "nx", "avx512vbmi", "popcnt", "spec-ctrl", "vaes", "pae", "fxsr", "ssbd", "la57", "avx512f", "hypervisor", "mca", "abm", "avx512vbmi2", "mtrr", "movbe", "pni", "rdseed", "xgetbv1", "pcid", "invpcid", "erms", "ht", "avx512bitalg", "mce", "umip", "de", "rdrand", "pdpe1gb", "smep", "xsave", "bmi1", "bmi2", "avx", "avx512vnni", "x2apic", "tsc-deadline", "f16c", "sse"]}',created_at=2026-04-23T00:42:02Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=23,free_disk_gb=77,free_ram_mb=31387,host='cn-jenkins-deploy-platform-juju-os-697-1',host_ip=10.0.0.129,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-697-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=0,mapped=1,memory_mb=31899,memory_mb_used=512,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035207, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["id", "cpu_usage", "memory_usage", "pinned_cpus", "network_metadata", "siblings", "memory", "socket", "pcpuset", "mempages", "cpuset"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=0,service_id=None,stats={failed_builds='0',io_workload='0',num_instances='0',num_os_type_None='0',num_proj_e4a006b39b3d4bbfad8d946d54c83f16='0',num_task_None='0',num_vm_active='0'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-23T01:50:44Z,uuid=aa32474e-00f6-4c5b-a2ec-d12a3f31dd05,vcpus=8,vcpus_used=0) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167
2026-04-23 01:50:55.637 134138 DEBUG nova.scheduler.host_manager [req-c80368f8-d290-426a-9fa4-46433003d6ff 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170
2026-04-23 01:50:55.637 134138 DEBUG nova.scheduler.host_manager [req-c80368f8-d290-426a-9fa4-46433003d6ff 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with service dict: {'id': 9, 'uuid': '979fb67d-7796-4f44-89f1-b2234e3e1381', 'host': 'cn-jenkins-deploy-platform-juju-os-697-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 411, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 23, 1, 50, 52, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 23, 0, 42, 2, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 23, 1, 50, 52, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173
2026-04-23 01:50:55.638 134138 DEBUG nova.scheduler.host_manager [req-c80368f8-d290-426a-9fa4-46433003d6ff 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176
2026-04-23 01:50:55.638 134138 DEBUG oslo_concurrency.lockutils [req-c80368f8-d290-426a-9fa4-46433003d6ff 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:50:55.638 134138 INFO nova.scheduler.host_manager [req-c80368f8-d290-426a-9fa4-46433003d6ff 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-697-1
2026-04-23 01:50:55.638 134138 DEBUG nova.scheduler.manager [req-c80368f8-d290-426a-9fa4-46433003d6ff 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 23552MB io_ops: 0 instances: 0]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610
2026-04-23 01:50:55.638 134138 DEBUG nova.scheduler.manager [req-c80368f8-d290-426a-9fa4-46433003d6ff 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 23552MB io_ops: 0 instances: 0, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632
2026-04-23 01:50:55.639 134138 DEBUG nova.scheduler.utils [req-c80368f8-d290-426a-9fa4-46433003d6ff 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Attempting to claim resources in the placement API for instance a04ac1f5-103c-4e54-b5fa-48c88bac3add claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253
2026-04-23 01:50:55.725 134138 DEBUG nova.scheduler.manager [req-c80368f8-d290-426a-9fa4-46433003d6ff 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] [instance: a04ac1f5-103c-4e54-b5fa-48c88bac3add] Selected host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 23552MB io_ops: 0 instances: 0 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513
2026-04-23 01:50:55.725 134138 DEBUG oslo_concurrency.lockutils [req-c80368f8-d290-426a-9fa4-46433003d6ff 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:50:55.726 134138 DEBUG oslo_concurrency.lockutils [req-c80368f8-d290-426a-9fa4-46433003d6ff 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:50:55.727 134138 DEBUG oslo_messaging._drivers.amqpdriver [req-c80368f8-d290-426a-9fa4-46433003d6ff 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: 23b153c7d8bd421097dc3c45836bdecb NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656
2026-04-23 01:50:55.728 134138 DEBUG oslo_messaging._drivers.amqpdriver [req-c80368f8-d290-426a-9fa4-46433003d6ff 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] sending reply msg_id: 85597c08a4df489a9c64fdeef83bb700 reply queue: reply_b20bc86560af44159724b4fea607f8c2 time elapsed: 0.5788462869995783s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118
2026-04-23 01:50:56.155 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:56.155 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:56.155 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:57.060 134140 DEBUG oslo_service.periodic_task [req-c3183001-f810-447d-b886-b598e20d99a5 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:50:57.064 134140 DEBUG oslo_concurrency.lockutils [req-08c1077f-07ac-4211-b0ce-7e42d716d54e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:50:57.065 134140 DEBUG oslo_concurrency.lockutils [req-08c1077f-07ac-4211-b0ce-7e42d716d54e - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:50:58.157 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:58.157 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:58.157 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:58.816 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: fa87128267f44bf0b64f9022cd534cbe __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:50:58.816 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: fa87128267f44bf0b64f9022cd534cbe __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:50:58.816 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:58.816 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:58.816 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: fa87128267f44bf0b64f9022cd534cbe poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:50:58.816 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: fa87128267f44bf0b64f9022cd534cbe poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:50:58.816 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:58.816 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:58.816 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:58.816 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:58.819 134145 DEBUG oslo_concurrency.lockutils [req-c80368f8-d290-426a-9fa4-46433003d6ff 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:50:58.819 134145 DEBUG oslo_concurrency.lockutils [req-c80368f8-d290-426a-9fa4-46433003d6ff 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:50:58.819 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: fa87128267f44bf0b64f9022cd534cbe __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:50:58.820 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:58.820 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:58.820 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: fa87128267f44bf0b64f9022cd534cbe poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:50:58.820 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:58.820 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:58.820 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:58.820 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:58.818 134146 DEBUG oslo_concurrency.lockutils [req-c80368f8-d290-426a-9fa4-46433003d6ff 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:50:58.822 134146 DEBUG oslo_concurrency.lockutils [req-c80368f8-d290-426a-9fa4-46433003d6ff 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.003s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:50:58.822 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:58.822 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:58.822 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:58.823 134140 DEBUG oslo_concurrency.lockutils [req-c80368f8-d290-426a-9fa4-46433003d6ff 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:50:58.824 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: fa87128267f44bf0b64f9022cd534cbe __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:50:58.824 134140 DEBUG oslo_concurrency.lockutils [req-c80368f8-d290-426a-9fa4-46433003d6ff 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:50:58.824 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:58.824 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:58.824 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: fa87128267f44bf0b64f9022cd534cbe poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:50:58.824 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:58.825 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:58.825 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:58.824 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:58.827 134138 DEBUG oslo_concurrency.lockutils [req-c80368f8-d290-426a-9fa4-46433003d6ff 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:50:58.827 134138 DEBUG oslo_concurrency.lockutils [req-c80368f8-d290-426a-9fa4-46433003d6ff 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:50:58.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:58.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:58.827 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:59.822 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:59.822 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:59.822 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:59.824 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:59.824 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:59.824 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:59.828 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:59.828 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:59.828 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:50:59.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:50:59.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:50:59.829 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:51:01.823 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:51:01.824 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:51:01.824 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:51:01.826 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:51:01.826 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:51:01.826 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:51:01.830 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:51:01.830 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:51:01.831 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:51:01.832 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:51:01.832 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:51:01.832 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:51:05.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:51:05.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23
01:51:05.828 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:05.828 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:05.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:05.829 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:05.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:05.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:05.834 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:05.835 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:05.835 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:05.835 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:07.090 134145 DEBUG oslo_service.periodic_task [req-02414845-fe83-48d9-a66f-33f152b17005 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells 
run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:51:07.094 134145 DEBUG oslo_concurrency.lockutils [req-3d0ffc2c-ab64-4a91-9b89-b79034dfb33b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:51:07.095 134145 DEBUG oslo_concurrency.lockutils [req-3d0ffc2c-ab64-4a91-9b89-b79034dfb33b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:51:13.836 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:13.836 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:13.836 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:13.836 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:13.836 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:13.836 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:13.842 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:13.842 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:13.842 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:13.842 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:13.842 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:13.842 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:19.825 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: d49a1f58eb2246a38dbfd0a0b0bfbb0a reply to reply_196060a39108420a873d953cb9a1134e __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-23 01:51:19.825 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:19.825 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 38e78844b8794aa6992dba8dd0f40aef poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:51:19.826 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:19.826 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:19.828 134145 DEBUG nova.scheduler.manager [req-0aee31d3-7d43-4509-bff0-a0b628ba36bd 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Starting to schedule for instances: ['46c854aa-2d00-47fd-944a-0dab3bdd0a61'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-23 01:51:19.831 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:19.831 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:19.831 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:19.833 134145 DEBUG nova.scheduler.request_filter [req-0aee31d3-7d43-4509-bff0-a0b628ba36bd 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:51:19.834 134145 DEBUG nova.scheduler.request_filter [req-0aee31d3-7d43-4509-bff0-a0b628ba36bd 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-23 01:51:19.834 134145 DEBUG nova.scheduler.request_filter [req-0aee31d3-7d43-4509-bff0-a0b628ba36bd 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 
463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:51:19.834 134145 DEBUG nova.scheduler.request_filter [req-0aee31d3-7d43-4509-bff0-a0b628ba36bd 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:51:19.835 134145 DEBUG nova.scheduler.request_filter [req-0aee31d3-7d43-4509-bff0-a0b628ba36bd 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:51:19.840 134145 DEBUG oslo_concurrency.lockutils [req-0aee31d3-7d43-4509-bff0-a0b628ba36bd 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:51:19.840 134145 DEBUG oslo_concurrency.lockutils [req-0aee31d3-7d43-4509-bff0-a0b628ba36bd 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:51:19.926 134145 DEBUG oslo_messaging._drivers.amqpdriver [req-0aee31d3-7d43-4509-bff0-a0b628ba36bd 
02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: c2ae989bae604eb2a5795647b65017a5 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:51:19.928 134145 DEBUG oslo_concurrency.lockutils [req-0aee31d3-7d43-4509-bff0-a0b628ba36bd 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:51:19.928 134145 DEBUG oslo_concurrency.lockutils [req-0aee31d3-7d43-4509-bff0-a0b628ba36bd 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:51:19.938 134145 DEBUG oslo_concurrency.lockutils [req-0aee31d3-7d43-4509-bff0-a0b628ba36bd 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:51:19.938 134145 DEBUG nova.scheduler.host_manager [req-0aee31d3-7d43-4509-bff0-a0b628ba36bd 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state from 
compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["fpu", "apic", "pku", "pge", "pse", "fsgsbase", "rtm", "mmx", "avx512bw", "cx16", "avx2", "cmov", "xsavec", "clwb", "smap", "clflushopt", "avx512-vpopcntdq", "pat", "lm", "pclmuldq", "msr", "sse4.2", "ssse3", "arat", "adx", "sse2", "fma", "aes", "rdtscp", "avx512vl", "xsaveopt", "3dnowprefetch", "vme", "hle", "sse4.1", "syscall", "vpclmulqdq", "cx8", "lahf_lm", "clflush", "avx512cd", "tsc", "avx512dq", "gfni", "pse36", "wbnoinvd", "sep", "nx", "avx512vbmi", "popcnt", "spec-ctrl", "vaes", "pae", "fxsr", "ssbd", "la57", "avx512f", "hypervisor", "mca", "abm", "avx512vbmi2", "mtrr", "movbe", "pni", "rdseed", "xgetbv1", "pcid", "invpcid", "erms", "ht", "avx512bitalg", "mce", "umip", "de", "rdrand", "pdpe1gb", "smep", "xsave", "bmi1", "bmi2", "avx", "avx512vnni", "x2apic", "tsc-deadline", "f16c", "sse"]}',created_at=2026-04-23T00:42:02Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=23,free_disk_gb=76,free_ram_mb=30363,host='cn-jenkins-deploy-platform-juju-os-697-1',host_ip=10.0.0.129,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-697-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=1,mapped=1,memory_mb=31899,memory_mb_used=1536,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", 
"nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035207, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["id", "cpu_usage", "memory_usage", "pinned_cpus", "network_metadata", "siblings", "memory", "socket", "pcpuset", "mempages", "cpuset"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=1,service_id=None,stats={failed_builds='0',io_workload='1',num_instances='1',num_os_type_None='1',num_proj_1f8f24a0e1684bffb76b2ddfbdc4eca6='1',num_proj_e4a006b39b3d4bbfad8d946d54c83f16='0',num_task_None='1',num_vm_active='0',num_vm_building='1'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-23T01:50:56Z,uuid=aa32474e-00f6-4c5b-a2ec-d12a3f31dd05,vcpus=8,vcpus_used=1) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-23 01:51:19.939 134145 DEBUG nova.scheduler.host_manager [req-0aee31d3-7d43-4509-bff0-a0b628ba36bd 02b56679ccf4449994ac69f952a46dd6 
1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-23 01:51:19.940 134145 DEBUG nova.scheduler.host_manager [req-0aee31d3-7d43-4509-bff0-a0b628ba36bd 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with service dict: {'id': 9, 'uuid': '979fb67d-7796-4f44-89f1-b2234e3e1381', 'host': 'cn-jenkins-deploy-platform-juju-os-697-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 413, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 23, 1, 51, 12, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 23, 0, 42, 2, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 23, 1, 51, 12, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-23 01:51:19.941 134145 DEBUG nova.scheduler.host_manager [req-0aee31d3-7d43-4509-bff0-a0b628ba36bd 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with instances: ['a04ac1f5-103c-4e54-b5fa-48c88bac3add'] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-23 01:51:19.941 134145 DEBUG oslo_concurrency.lockutils [req-0aee31d3-7d43-4509-bff0-a0b628ba36bd 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.003s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:51:19.942 134145 INFO nova.scheduler.host_manager [req-0aee31d3-7d43-4509-bff0-a0b628ba36bd 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-697-1 2026-04-23 01:51:19.942 134145 DEBUG nova.scheduler.manager [req-0aee31d3-7d43-4509-bff0-a0b628ba36bd 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 30363MB disk: 23552MB io_ops: 1 instances: 1]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-23 01:51:19.942 134145 DEBUG nova.scheduler.manager [req-0aee31d3-7d43-4509-bff0-a0b628ba36bd 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 30363MB disk: 23552MB io_ops: 1 instances: 1, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-23 01:51:19.942 134145 DEBUG nova.scheduler.utils [req-0aee31d3-7d43-4509-bff0-a0b628ba36bd 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Attempting to claim resources in the placement API for instance 46c854aa-2d00-47fd-944a-0dab3bdd0a61 claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-23 01:51:20.018 134145 DEBUG nova.scheduler.manager [req-0aee31d3-7d43-4509-bff0-a0b628ba36bd 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 
463c1a7f457542babd061a9f36cbf244] [instance: 46c854aa-2d00-47fd-944a-0dab3bdd0a61] Selected host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 30363MB disk: 23552MB io_ops: 1 instances: 1 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-23 01:51:20.019 134145 DEBUG oslo_concurrency.lockutils [req-0aee31d3-7d43-4509-bff0-a0b628ba36bd 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:51:20.019 134145 DEBUG oslo_concurrency.lockutils [req-0aee31d3-7d43-4509-bff0-a0b628ba36bd 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:51:20.021 134145 DEBUG oslo_messaging._drivers.amqpdriver [req-0aee31d3-7d43-4509-bff0-a0b628ba36bd 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: aaeade76537c4b5c877145ff172110ce NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:51:20.023 134145 DEBUG oslo_messaging._drivers.amqpdriver [req-0aee31d3-7d43-4509-bff0-a0b628ba36bd 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 
463c1a7f457542babd061a9f36cbf244] sending reply msg_id: d49a1f58eb2246a38dbfd0a0b0bfbb0a reply queue: reply_196060a39108420a873d953cb9a1134e time elapsed: 0.19751929599988216s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-23 01:51:20.834 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:20.834 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:20.834 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:21.020 134138 DEBUG oslo_service.periodic_task [req-73925287-8978-4e66-b099-6d9c7c7a62d9 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:51:21.024 134138 DEBUG oslo_concurrency.lockutils [req-bce41ef0-2ebd-4b72-9b36-0e063a3ef3cc - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:51:21.024 134138 DEBUG oslo_concurrency.lockutils [req-bce41ef0-2ebd-4b72-9b36-0e063a3ef3cc - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:51:21.069 134146 DEBUG oslo_service.periodic_task [req-a5115e6b-448a-4b98-a438-4ea5bc311770 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks 
/usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:51:21.073 134146 DEBUG oslo_concurrency.lockutils [req-ca2e4d59-63d1-4416-ad30-bb825cbe4af7 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:51:21.074 134146 DEBUG oslo_concurrency.lockutils [req-ca2e4d59-63d1-4416-ad30-bb825cbe4af7 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:51:22.837 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:22.837 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:22.837 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:23.047 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 2171adacb20446b5b1dbcde116a39f2f __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:51:23.047 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 2171adacb20446b5b1dbcde116a39f2f __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:51:23.048 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:23.048 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is 
running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:23.048 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 2171adacb20446b5b1dbcde116a39f2f poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:51:23.048 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 2171adacb20446b5b1dbcde116a39f2f poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:51:23.048 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:23.048 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:23.048 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:23.048 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:23.049 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 2171adacb20446b5b1dbcde116a39f2f __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:51:23.049 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:23.050 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 2171adacb20446b5b1dbcde116a39f2f poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:51:23.050 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:23.050 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:23.051 134140 DEBUG oslo_concurrency.lockutils [req-0aee31d3-7d43-4509-bff0-a0b628ba36bd 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:51:23.051 134140 DEBUG oslo_concurrency.lockutils [req-0aee31d3-7d43-4509-bff0-a0b628ba36bd 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:51:23.052 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:23.052 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:23.052 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:23.054 134146 DEBUG oslo_concurrency.lockutils [req-0aee31d3-7d43-4509-bff0-a0b628ba36bd 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: 
waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:51:23.055 134146 DEBUG oslo_concurrency.lockutils [req-0aee31d3-7d43-4509-bff0-a0b628ba36bd 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:51:23.055 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:23.055 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:23.055 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:23.054 134138 DEBUG oslo_concurrency.lockutils [req-0aee31d3-7d43-4509-bff0-a0b628ba36bd 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:51:23.055 134138 DEBUG oslo_concurrency.lockutils [req-0aee31d3-7d43-4509-bff0-a0b628ba36bd 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:51:23.056 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener 
connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:23.056 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:23.056 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:23.057 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 2171adacb20446b5b1dbcde116a39f2f __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:51:23.057 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:23.058 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 2171adacb20446b5b1dbcde116a39f2f poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:51:23.058 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:23.058 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:23.060 134145 DEBUG oslo_concurrency.lockutils [req-0aee31d3-7d43-4509-bff0-a0b628ba36bd 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:51:23.060 134145 DEBUG oslo_concurrency.lockutils [req-0aee31d3-7d43-4509-bff0-a0b628ba36bd 
02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:51:23.062 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:23.063 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:23.063 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:24.053 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:24.054 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:24.054 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:24.056 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:24.057 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:24.057 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:24.058 134146 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:24.058 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:24.058 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:24.064 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:24.065 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:24.065 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:26.057 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:26.057 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:26.057 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:26.059 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:26.059 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:26.059 134138 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:26.059 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:26.060 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:26.060 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:26.066 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:26.066 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:26.066 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:28.053 134140 DEBUG oslo_service.periodic_task [req-08c1077f-07ac-4211-b0ce-7e42d716d54e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:51:28.056 134140 DEBUG oslo_concurrency.lockutils [req-8eda3e94-ad95-4014-996f-57d4aef20a38 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:51:28.057 134140 DEBUG oslo_concurrency.lockutils [req-8eda3e94-ad95-4014-996f-57d4aef20a38 - - - - -] Lock 
"de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:51:30.060 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:30.061 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:30.061 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:30.062 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:30.062 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:30.062 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:30.062 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:30.062 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:30.062 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:30.068 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:30.069 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:30.069 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:37.101 134145 DEBUG oslo_service.periodic_task [req-3d0ffc2c-ab64-4a91-9b89-b79034dfb33b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:51:37.105 134145 DEBUG oslo_concurrency.lockutils [req-fda1e428-d73b-48df-a5e3-ddd89b092f81 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:51:37.105 134145 DEBUG oslo_concurrency.lockutils [req-fda1e428-d73b-48df-a5e3-ddd89b092f81 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:51:38.063 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:38.064 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:38.064 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:38.065 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] 
AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:38.065 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:38.065 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:38.065 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:38.065 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:38.065 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:38.071 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:38.071 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:38.072 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:47.777 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: bc4c3076c3a4431b94e39ce0e3537680 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:51:47.777 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: bc4c3076c3a4431b94e39ce0e3537680 __call__ 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:51:47.778 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:47.778 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:47.778 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: bc4c3076c3a4431b94e39ce0e3537680 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:51:47.778 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: bc4c3076c3a4431b94e39ce0e3537680 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:51:47.778 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: bc4c3076c3a4431b94e39ce0e3537680 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:51:47.778 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:47.778 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:47.778 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:47.778 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:47.778 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 
2026-04-23 01:51:47.778 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: bc4c3076c3a4431b94e39ce0e3537680 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:51:47.779 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:47.779 134145 DEBUG oslo_concurrency.lockutils [req-af657c68-71bd-460f-8856-d46d00c4e58b 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:51:47.779 134140 DEBUG oslo_concurrency.lockutils [req-af657c68-71bd-460f-8856-d46d00c4e58b 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:51:47.779 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:47.779 134145 DEBUG oslo_concurrency.lockutils [req-af657c68-71bd-460f-8856-d46d00c4e58b 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:51:47.779 134140 DEBUG oslo_concurrency.lockutils [req-af657c68-71bd-460f-8856-d46d00c4e58b 
02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:51:47.780 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: bc4c3076c3a4431b94e39ce0e3537680 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:51:47.780 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:47.780 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:47.781 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:47.781 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:47.781 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:47.781 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:47.781 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:47.781 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: bc4c3076c3a4431b94e39ce0e3537680 poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:51:47.781 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:47.781 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:47.783 134146 DEBUG oslo_concurrency.lockutils [req-af657c68-71bd-460f-8856-d46d00c4e58b 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:51:47.784 134146 DEBUG oslo_concurrency.lockutils [req-af657c68-71bd-460f-8856-d46d00c4e58b 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:51:47.784 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:47.785 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:47.785 134138 DEBUG oslo_concurrency.lockutils [req-af657c68-71bd-460f-8856-d46d00c4e58b 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s 
inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:51:47.786 134138 DEBUG oslo_concurrency.lockutils [req-af657c68-71bd-460f-8856-d46d00c4e58b 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:51:47.786 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:47.786 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:47.787 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:47.785 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:47.842 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 7e408f374a504a16a6b21f67232ae919 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:51:47.842 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 7e408f374a504a16a6b21f67232ae919 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:51:47.842 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:47.843 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 
7e408f374a504a16a6b21f67232ae919 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:51:47.842 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 7e408f374a504a16a6b21f67232ae919 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:51:47.843 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:47.843 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:47.844 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 7e408f374a504a16a6b21f67232ae919 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:51:47.844 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:47.844 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:47.843 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:47.843 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 7e408f374a504a16a6b21f67232ae919 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:51:47.845 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:47.845 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 
7e408f374a504a16a6b21f67232ae919 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:51:47.845 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:47.843 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:47.846 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 7e408f374a504a16a6b21f67232ae919 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:51:47.847 134140 DEBUG oslo_concurrency.lockutils [req-bf4fe687-89e2-4754-9cdc-6a4d892a8bf7 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:51:47.847 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:47.847 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:51:47.847 134140 DEBUG oslo_concurrency.lockutils [req-bf4fe687-89e2-4754-9cdc-6a4d892a8bf7 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:51:47.847 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection 
timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:51:47.846 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:51:47.848 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:51:47.848 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:51:47.848 134146 DEBUG oslo_concurrency.lockutils [req-bf4fe687-89e2-4754-9cdc-6a4d892a8bf7 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:51:47.849 134145 DEBUG oslo_concurrency.lockutils [req-bf4fe687-89e2-4754-9cdc-6a4d892a8bf7 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:51:47.849 134145 DEBUG oslo_concurrency.lockutils [req-bf4fe687-89e2-4754-9cdc-6a4d892a8bf7 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:51:47.848 134138 DEBUG oslo_concurrency.lockutils [req-bf4fe687-89e2-4754-9cdc-6a4d892a8bf7 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:51:47.849 134138 DEBUG oslo_concurrency.lockutils [req-bf4fe687-89e2-4754-9cdc-6a4d892a8bf7 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:51:47.850 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:51:47.850 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:51:47.850 134146 DEBUG oslo_concurrency.lockutils [req-bf4fe687-89e2-4754-9cdc-6a4d892a8bf7 02b56679ccf4449994ac69f952a46dd6 1f8f24a0e1684bffb76b2ddfbdc4eca6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:51:47.850 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:51:47.850 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:51:47.851 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:51:47.851 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:51:47.851 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:51:47.851 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:51:47.850 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:51:48.849 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:51:48.850 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:51:48.850 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:51:48.851 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:51:48.851 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:51:48.852 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:51:48.852 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:51:48.852 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:51:48.852 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:51:48.852 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:51:48.852 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:51:48.853 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:51:50.851 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:51:50.851 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:51:50.852 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:51:50.854 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:51:50.854 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:51:50.854 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:51:50.856 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:51:50.856 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:51:50.856 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:51:50.857 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:51:50.857 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:51:50.857 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:51:52.020 134138 DEBUG oslo_service.periodic_task [req-bce41ef0-2ebd-4b72-9b36-0e063a3ef3cc - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:51:52.024 134138 DEBUG oslo_concurrency.lockutils [req-24b18efc-0cf2-49ad-be4c-813d78041a3a - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:51:52.024 134138 DEBUG oslo_concurrency.lockutils [req-24b18efc-0cf2-49ad-be4c-813d78041a3a - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:51:52.073 134146 DEBUG oslo_service.periodic_task [req-ca2e4d59-63d1-4416-ad30-bb825cbe4af7 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:51:52.077 134146 DEBUG oslo_concurrency.lockutils [req-e2bc15fb-233d-40c4-a7d6-1b49b1792ec2 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:51:52.078 134146 DEBUG oslo_concurrency.lockutils [req-e2bc15fb-233d-40c4-a7d6-1b49b1792ec2 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:51:54.854 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:51:54.854 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:51:54.854 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:51:54.857 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:51:54.857 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:51:54.857 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:51:54.858 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:51:54.858 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:51:54.858 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:51:54.860 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:51:54.860 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:51:54.860 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:51:57.640 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: 950fb75b9c714179b88bb389959194c2 reply to reply_85578e9317f24a84af7e369b8d783622 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320
2026-04-23 01:51:57.640 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:51:57.640 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f214900b95ee4f8ca46ea91fdd819bf5 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:51:57.640 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:51:57.641 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:51:57.643 134140 DEBUG nova.scheduler.manager [req-41a6e450-22ea-4829-95d0-fb07adbe8f5b 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Starting to schedule for instances: ['d34ea5c5-3ed5-4173-8dec-a22e1a96128f'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141
2026-04-23 01:51:57.645 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:51:57.645 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:51:57.645 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:51:57.648 134140 DEBUG nova.scheduler.request_filter [req-41a6e450-22ea-4829-95d0-fb07adbe8f5b 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-23 01:51:57.648 134140 DEBUG nova.scheduler.request_filter [req-41a6e450-22ea-4829-95d0-fb07adbe8f5b 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254
2026-04-23 01:51:57.649 134140 DEBUG nova.scheduler.request_filter [req-41a6e450-22ea-4829-95d0-fb07adbe8f5b 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-23 01:51:57.649 134140 DEBUG nova.scheduler.request_filter [req-41a6e450-22ea-4829-95d0-fb07adbe8f5b 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-23 01:51:57.649 134140 DEBUG nova.scheduler.request_filter [req-41a6e450-22ea-4829-95d0-fb07adbe8f5b 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-23 01:51:57.654 134140 DEBUG oslo_concurrency.lockutils [req-41a6e450-22ea-4829-95d0-fb07adbe8f5b 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:51:57.654 134140 DEBUG oslo_concurrency.lockutils [req-41a6e450-22ea-4829-95d0-fb07adbe8f5b 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:51:57.816 134140 DEBUG oslo_messaging._drivers.amqpdriver [req-41a6e450-22ea-4829-95d0-fb07adbe8f5b 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: 4542441e1f31487c921919fa9530ca0a NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656
2026-04-23 01:51:57.819 134140 DEBUG oslo_concurrency.lockutils [req-41a6e450-22ea-4829-95d0-fb07adbe8f5b 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:51:57.819 134140 DEBUG oslo_concurrency.lockutils [req-41a6e450-22ea-4829-95d0-fb07adbe8f5b 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:51:57.830 134140 DEBUG oslo_concurrency.lockutils [req-41a6e450-22ea-4829-95d0-fb07adbe8f5b 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:51:57.830 134140 DEBUG nova.scheduler.host_manager [req-41a6e450-22ea-4829-95d0-fb07adbe8f5b 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["fpu", "apic", "pku", "pge", "pse", "fsgsbase", "rtm", "mmx", "avx512bw", "cx16", "avx2", "cmov", "xsavec", "clwb", "smap", "clflushopt", "avx512-vpopcntdq", "pat", "lm", "pclmuldq", "msr", "sse4.2", "ssse3", "arat", "adx", "sse2", "fma", "aes", "rdtscp", "avx512vl", "xsaveopt", "3dnowprefetch", "vme", "hle", "sse4.1", "syscall", "vpclmulqdq", "cx8", "lahf_lm", "clflush", "avx512cd", "tsc", "avx512dq", "gfni", "pse36", "wbnoinvd", "sep", "nx", "avx512vbmi", "popcnt", "spec-ctrl", "vaes", "pae", "fxsr", "ssbd", "la57", "avx512f", "hypervisor", "mca", "abm", "avx512vbmi2", "mtrr", "movbe", "pni", "rdseed", "xgetbv1", "pcid", "invpcid", "erms", "ht", "avx512bitalg", "mce", "umip", "de", "rdrand", "pdpe1gb", "smep", "xsave", "bmi1", "bmi2", "avx", "avx512vnni", "x2apic", "tsc-deadline", "f16c", "sse"]}',created_at=2026-04-23T00:42:02Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=24,free_disk_gb=77,free_ram_mb=31387,host='cn-jenkins-deploy-platform-juju-os-697-1',host_ip=10.0.0.129,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-697-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=0,mapped=1,memory_mb=31899,memory_mb_used=512,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035207, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["id", "cpu_usage", "memory_usage", "pinned_cpus", "network_metadata", "siblings", "memory", "socket", "pcpuset", "mempages", "cpuset"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=0,service_id=None,stats={failed_builds='0',io_workload='0',num_instances='0',num_os_type_None='0',num_proj_1f8f24a0e1684bffb76b2ddfbdc4eca6='0',num_task_None='0',num_vm_active='0'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-23T01:51:48Z,uuid=aa32474e-00f6-4c5b-a2ec-d12a3f31dd05,vcpus=8,vcpus_used=0) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167
2026-04-23 01:51:57.832 134140 DEBUG nova.scheduler.host_manager [req-41a6e450-22ea-4829-95d0-fb07adbe8f5b 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170
2026-04-23 01:51:57.832 134140 DEBUG nova.scheduler.host_manager [req-41a6e450-22ea-4829-95d0-fb07adbe8f5b 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with service dict: {'id': 9, 'uuid': '979fb67d-7796-4f44-89f1-b2234e3e1381', 'host': 'cn-jenkins-deploy-platform-juju-os-697-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 417, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 23, 1, 51, 52, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 23, 0, 42, 2, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 23, 1, 51, 52, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173
2026-04-23 01:51:57.832 134140 DEBUG nova.scheduler.host_manager [req-41a6e450-22ea-4829-95d0-fb07adbe8f5b 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176
2026-04-23 01:51:57.832 134140 DEBUG oslo_concurrency.lockutils [req-41a6e450-22ea-4829-95d0-fb07adbe8f5b 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:51:57.833 134140 INFO nova.scheduler.host_manager [req-41a6e450-22ea-4829-95d0-fb07adbe8f5b 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-697-1
2026-04-23 01:51:57.833 134140 DEBUG nova.scheduler.manager [req-41a6e450-22ea-4829-95d0-fb07adbe8f5b 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 24576MB io_ops: 0 instances: 0]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610
2026-04-23 01:51:57.833 134140 DEBUG nova.scheduler.manager [req-41a6e450-22ea-4829-95d0-fb07adbe8f5b 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 24576MB io_ops: 0 instances: 0, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632
2026-04-23 01:51:57.833 134140 DEBUG nova.scheduler.utils [req-41a6e450-22ea-4829-95d0-fb07adbe8f5b 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Attempting to claim resources in the placement API for instance d34ea5c5-3ed5-4173-8dec-a22e1a96128f claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253
2026-04-23 01:51:57.930 134140 DEBUG nova.scheduler.manager [req-41a6e450-22ea-4829-95d0-fb07adbe8f5b 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] [instance: d34ea5c5-3ed5-4173-8dec-a22e1a96128f] Selected host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 24576MB io_ops: 0 instances: 0 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513
2026-04-23 01:51:57.931 134140 DEBUG oslo_concurrency.lockutils [req-41a6e450-22ea-4829-95d0-fb07adbe8f5b 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:51:57.931 134140 DEBUG oslo_concurrency.lockutils [req-41a6e450-22ea-4829-95d0-fb07adbe8f5b 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:51:57.934 134140 DEBUG oslo_messaging._drivers.amqpdriver [req-41a6e450-22ea-4829-95d0-fb07adbe8f5b 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: b28c61bc69134a4b8201470480d9de37 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656
2026-04-23 01:51:57.937 134140 DEBUG oslo_messaging._drivers.amqpdriver [req-41a6e450-22ea-4829-95d0-fb07adbe8f5b 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] sending reply msg_id: 950fb75b9c714179b88bb389959194c2 reply queue: reply_85578e9317f24a84af7e369b8d783622 time elapsed: 0.2971718580001834s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118
2026-04-23 01:51:58.062 134140 DEBUG oslo_service.periodic_task [req-8eda3e94-ad95-4014-996f-57d4aef20a38 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:51:58.069 134140 DEBUG oslo_concurrency.lockutils [req-b3f25709-1d32-49f7-8cfb-988d15e6a499 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:51:58.070 134140 DEBUG oslo_concurrency.lockutils [req-b3f25709-1d32-49f7-8cfb-988d15e6a499 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:51:58.646 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:51:58.646 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:51:58.646 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:51:58.884 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: 5267b8666a3c41219a15edf690b51d2d reply to reply_979ed65011a945248f9b38f6dd19b658 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320
2026-04-23 01:51:58.885 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:51:58.885 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 5e9de53fa5f54672b51e7e2dc60344a2 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:51:58.885 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:51:58.885 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:51:58.887 134146 DEBUG nova.scheduler.manager [req-75dca53d-ec04-4f8a-99e7-97444cb7f879 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Starting to schedule for instances: ['7e180c3c-053f-4d5c-afd6-860a4e0bfd03'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141
2026-04-23 01:51:58.888 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:51:58.888 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:51:58.889 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:51:58.895 134146 DEBUG nova.scheduler.request_filter [req-75dca53d-ec04-4f8a-99e7-97444cb7f879 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-23 01:51:58.895 134146 DEBUG nova.scheduler.request_filter [req-75dca53d-ec04-4f8a-99e7-97444cb7f879 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254
2026-04-23 01:51:58.895 134146 DEBUG nova.scheduler.request_filter [req-75dca53d-ec04-4f8a-99e7-97444cb7f879 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-23 01:51:58.895 134146 DEBUG nova.scheduler.request_filter [req-75dca53d-ec04-4f8a-99e7-97444cb7f879 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-23 01:51:58.896 134146 DEBUG nova.scheduler.request_filter [req-75dca53d-ec04-4f8a-99e7-97444cb7f879 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-23 01:51:58.900 134146 DEBUG oslo_concurrency.lockutils [req-75dca53d-ec04-4f8a-99e7-97444cb7f879 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:51:58.901 134146 DEBUG oslo_concurrency.lockutils [req-75dca53d-ec04-4f8a-99e7-97444cb7f879 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:51:59.005 134146 DEBUG oslo_messaging._drivers.amqpdriver [req-75dca53d-ec04-4f8a-99e7-97444cb7f879 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: 69f7714b80334ba3a19958c098553e1e NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656
2026-04-23 01:51:59.007 134146 DEBUG oslo_concurrency.lockutils [req-75dca53d-ec04-4f8a-99e7-97444cb7f879 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:51:59.008 134146 DEBUG oslo_concurrency.lockutils [req-75dca53d-ec04-4f8a-99e7-97444cb7f879 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:51:59.017 134146 DEBUG oslo_concurrency.lockutils [req-75dca53d-ec04-4f8a-99e7-97444cb7f879 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:51:59.017 134146 DEBUG nova.scheduler.host_manager [req-75dca53d-ec04-4f8a-99e7-97444cb7f879 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["fpu", "apic", "pku", "pge", "pse", "fsgsbase", "rtm", "mmx", "avx512bw", "cx16", "avx2", "cmov", "xsavec", "clwb", "smap", "clflushopt", "avx512-vpopcntdq", "pat", "lm", "pclmuldq", "msr", "sse4.2", "ssse3", "arat", "adx", "sse2", "fma", "aes", "rdtscp", "avx512vl", "xsaveopt", "3dnowprefetch", "vme", "hle", "sse4.1", "syscall", "vpclmulqdq", "cx8", "lahf_lm", "clflush", "avx512cd", "tsc", "avx512dq", "gfni", "pse36", "wbnoinvd", "sep", "nx", "avx512vbmi", "popcnt", "spec-ctrl", "vaes", "pae", "fxsr", "ssbd", "la57", "avx512f", "hypervisor", "mca", "abm", "avx512vbmi2", "mtrr", "movbe", "pni", "rdseed", "xgetbv1", "pcid", "invpcid", "erms", "ht", "avx512bitalg", "mce", "umip", "de", "rdrand", "pdpe1gb", "smep", "xsave", "bmi1", "bmi2", "avx", "avx512vnni", "x2apic", "tsc-deadline", "f16c", "sse"]}',created_at=2026-04-23T00:42:02Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=24,free_disk_gb=76,free_ram_mb=30363,host='cn-jenkins-deploy-platform-juju-os-697-1',host_ip=10.0.0.129,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-697-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=1,mapped=1,memory_mb=31899,memory_mb_used=1536,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035207, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1",
"nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["id", "cpu_usage", "memory_usage", "pinned_cpus", "network_metadata", "siblings", "memory", "socket", "pcpuset", "mempages", "cpuset"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=1,service_id=None,stats={failed_builds='0',io_workload='1',num_instances='1',num_os_type_None='1',num_proj_1f8f24a0e1684bffb76b2ddfbdc4eca6='0',num_proj_bd5129d12b6d437f895b62124e142c13='1',num_task_None='1',num_vm_active='0',num_vm_building='1'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-23T01:51:58Z,uuid=aa32474e-00f6-4c5b-a2ec-d12a3f31dd05,vcpus=8,vcpus_used=1) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-23 01:51:59.018 134146 DEBUG nova.scheduler.host_manager [req-75dca53d-ec04-4f8a-99e7-97444cb7f879 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-23 01:51:59.018 134146 DEBUG nova.scheduler.host_manager [req-75dca53d-ec04-4f8a-99e7-97444cb7f879 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with service dict: {'id': 9, 'uuid': '979fb67d-7796-4f44-89f1-b2234e3e1381', 'host': 
'cn-jenkins-deploy-platform-juju-os-697-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 417, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 23, 1, 51, 52, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 23, 0, 42, 2, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 23, 1, 51, 52, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-23 01:51:59.019 134146 DEBUG nova.scheduler.host_manager [req-75dca53d-ec04-4f8a-99e7-97444cb7f879 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-23 01:51:59.019 134146 DEBUG oslo_concurrency.lockutils [req-75dca53d-ec04-4f8a-99e7-97444cb7f879 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:51:59.019 134146 INFO nova.scheduler.host_manager [req-75dca53d-ec04-4f8a-99e7-97444cb7f879 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-697-1 2026-04-23 01:51:59.019 134146 DEBUG nova.scheduler.manager [req-75dca53d-ec04-4f8a-99e7-97444cb7f879 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Filtered 
dict_values([(cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 30363MB disk: 24576MB io_ops: 1 instances: 1]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-23 01:51:59.019 134146 DEBUG nova.scheduler.manager [req-75dca53d-ec04-4f8a-99e7-97444cb7f879 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 30363MB disk: 24576MB io_ops: 1 instances: 1, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-23 01:51:59.020 134146 DEBUG nova.scheduler.utils [req-75dca53d-ec04-4f8a-99e7-97444cb7f879 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Attempting to claim resources in the placement API for instance 7e180c3c-053f-4d5c-afd6-860a4e0bfd03 claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-23 01:51:59.101 134146 DEBUG nova.scheduler.manager [req-75dca53d-ec04-4f8a-99e7-97444cb7f879 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] [instance: 7e180c3c-053f-4d5c-afd6-860a4e0bfd03] Selected host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 30363MB disk: 24576MB io_ops: 1 instances: 1 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-23 01:51:59.102 134146 DEBUG oslo_concurrency.lockutils [req-75dca53d-ec04-4f8a-99e7-97444cb7f879 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" 
acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:51:59.102 134146 DEBUG oslo_concurrency.lockutils [req-75dca53d-ec04-4f8a-99e7-97444cb7f879 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:51:59.103 134146 DEBUG oslo_messaging._drivers.amqpdriver [req-75dca53d-ec04-4f8a-99e7-97444cb7f879 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: 2b11d717876c471486829f2cf95dfaec NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:51:59.106 134146 DEBUG oslo_messaging._drivers.amqpdriver [req-75dca53d-ec04-4f8a-99e7-97444cb7f879 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] sending reply msg_id: 5267b8666a3c41219a15edf690b51d2d reply queue: reply_979ed65011a945248f9b38f6dd19b658 time elapsed: 0.22121875900029409s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-23 01:51:59.890 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:51:59.891 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:51:59.891 134146 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:00.649 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:00.649 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:00.649 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:01.520 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: f036230e059b4b04ab1f7e5838fd3a10 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:52:01.520 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: f036230e059b4b04ab1f7e5838fd3a10 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:52:01.520 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: f036230e059b4b04ab1f7e5838fd3a10 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:52:01.520 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:01.521 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:01.521 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f036230e059b4b04ab1f7e5838fd3a10 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:52:01.521 134146 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f036230e059b4b04ab1f7e5838fd3a10 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:52:01.520 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:01.521 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:01.521 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f036230e059b4b04ab1f7e5838fd3a10 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:52:01.521 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:01.521 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:01.521 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:01.521 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:01.521 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:01.523 134146 DEBUG oslo_concurrency.lockutils [req-41a6e450-22ea-4829-95d0-fb07adbe8f5b 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by 
"nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:52:01.523 134145 DEBUG oslo_concurrency.lockutils [req-41a6e450-22ea-4829-95d0-fb07adbe8f5b 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:52:01.523 134146 DEBUG oslo_concurrency.lockutils [req-41a6e450-22ea-4829-95d0-fb07adbe8f5b 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:52:01.523 134138 DEBUG oslo_concurrency.lockutils [req-41a6e450-22ea-4829-95d0-fb07adbe8f5b 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:52:01.523 134145 DEBUG oslo_concurrency.lockutils [req-41a6e450-22ea-4829-95d0-fb07adbe8f5b 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:52:01.524 134138 DEBUG oslo_concurrency.lockutils [req-41a6e450-22ea-4829-95d0-fb07adbe8f5b 50499585da76413d8ec8e3869bd7b484 
bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:52:01.524 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:01.524 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:01.524 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:01.524 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:01.524 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:01.524 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:01.524 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:01.524 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:01.524 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:01.525 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] 
received message with unique_id: f036230e059b4b04ab1f7e5838fd3a10 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:52:01.525 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:01.525 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f036230e059b4b04ab1f7e5838fd3a10 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:52:01.526 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:01.526 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:01.528 134140 DEBUG oslo_concurrency.lockutils [req-41a6e450-22ea-4829-95d0-fb07adbe8f5b 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:52:01.528 134140 DEBUG oslo_concurrency.lockutils [req-41a6e450-22ea-4829-95d0-fb07adbe8f5b 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:52:01.528 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:01.528 134140 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:01.529 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:02.525 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:02.525 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:02.525 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:02.526 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:02.526 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:02.526 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:02.526 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:02.526 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:02.526 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:02.529 134140 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:02.529 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:02.530 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:02.665 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 196bd4922587469f8ed9bebfe833d514 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:52:02.665 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 196bd4922587469f8ed9bebfe833d514 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:52:02.666 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 196bd4922587469f8ed9bebfe833d514 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:52:02.666 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:02.666 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:02.666 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:02.666 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 196bd4922587469f8ed9bebfe833d514 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:52:02.666 134146 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 196bd4922587469f8ed9bebfe833d514 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:52:02.666 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 196bd4922587469f8ed9bebfe833d514 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:52:02.666 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:02.666 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:02.666 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:02.666 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:02.666 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:02.666 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:02.668 134145 DEBUG oslo_concurrency.lockutils [req-75dca53d-ec04-4f8a-99e7-97444cb7f879 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:52:02.669 134145 DEBUG oslo_concurrency.lockutils 
[req-75dca53d-ec04-4f8a-99e7-97444cb7f879 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:52:02.669 134138 DEBUG oslo_concurrency.lockutils [req-75dca53d-ec04-4f8a-99e7-97444cb7f879 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:52:02.669 134146 DEBUG oslo_concurrency.lockutils [req-75dca53d-ec04-4f8a-99e7-97444cb7f879 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:52:02.669 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:02.669 134138 DEBUG oslo_concurrency.lockutils [req-75dca53d-ec04-4f8a-99e7-97444cb7f879 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:52:02.669 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 
2026-04-23 01:52:02.669 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:02.669 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:02.669 134146 DEBUG oslo_concurrency.lockutils [req-75dca53d-ec04-4f8a-99e7-97444cb7f879 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:52:02.669 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:02.669 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:02.670 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:02.670 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:02.670 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:02.670 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 196bd4922587469f8ed9bebfe833d514 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:52:02.671 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:02.671 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 196bd4922587469f8ed9bebfe833d514 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:52:02.671 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:02.671 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:02.673 134140 DEBUG oslo_concurrency.lockutils [req-75dca53d-ec04-4f8a-99e7-97444cb7f879 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:52:02.673 134140 DEBUG oslo_concurrency.lockutils [req-75dca53d-ec04-4f8a-99e7-97444cb7f879 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:52:02.674 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:02.674 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:02.674 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:03.670 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:03.670 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:03.671 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:03.671 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:03.671 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:03.671 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:03.671 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:03.672 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:03.672 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:03.676 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:03.676 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:03.676 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:05.672 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:05.673 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:05.673 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:05.673 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:05.673 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:05.673 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:05.674 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:05.674 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:05.674 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:05.678 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:05.678 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:05.678 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:07.111 134145 DEBUG oslo_service.periodic_task [req-fda1e428-d73b-48df-a5e3-ddd89b092f81 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:52:07.115 134145 DEBUG oslo_concurrency.lockutils [req-531bdeff-4dd1-4941-b61a-4ea3e4e4c5a5 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:52:07.115 134145 DEBUG oslo_concurrency.lockutils [req-531bdeff-4dd1-4941-b61a-4ea3e4e4c5a5 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:52:09.677 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:09.677 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:09.677 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:09.677 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:09.677 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:09.677 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:09.678 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:09.678 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:09.678 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:09.682 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:09.682 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:09.682 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:17.680 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:17.680 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:17.680 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:17.681 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:17.681 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:17.681 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:17.682 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:17.682 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:17.682 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:17.684 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:17.684 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:17.685 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:22.084 134146 DEBUG oslo_service.periodic_task [req-e2bc15fb-233d-40c4-a7d6-1b49b1792ec2 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:52:22.089 134146 DEBUG oslo_concurrency.lockutils [req-8e85047f-6c8e-410d-85e3-427b0f9673d9 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:52:22.089 134146 DEBUG oslo_concurrency.lockutils [req-8e85047f-6c8e-410d-85e3-427b0f9673d9 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:52:23.019 134138 DEBUG oslo_service.periodic_task [req-24b18efc-0cf2-49ad-be4c-813d78041a3a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:52:23.024 134138 DEBUG oslo_concurrency.lockutils [req-44f6919a-93ed-4166-bb78-5f514a8a3e07 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:52:23.024 134138 DEBUG oslo_concurrency.lockutils [req-44f6919a-93ed-4166-bb78-5f514a8a3e07 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:52:24.383 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 5fa0713837034cbd9a2bb6f57342324a __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:52:24.383 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 5fa0713837034cbd9a2bb6f57342324a __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:52:24.383 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 5fa0713837034cbd9a2bb6f57342324a __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:52:24.384 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:24.384 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:24.384 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 5fa0713837034cbd9a2bb6f57342324a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:52:24.384 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 5fa0713837034cbd9a2bb6f57342324a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:52:24.384 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:24.384 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:24.384 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 5fa0713837034cbd9a2bb6f57342324a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:52:24.384 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:24.384 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:24.384 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:24.384 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:24.385 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:24.384 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 5fa0713837034cbd9a2bb6f57342324a __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:52:24.385 134145 DEBUG oslo_concurrency.lockutils [req-0e95881a-a378-4911-aca6-fbed7dd1bf45 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:52:24.385 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:24.385 134138 DEBUG oslo_concurrency.lockutils [req-0e95881a-a378-4911-aca6-fbed7dd1bf45 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:52:24.385 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 5fa0713837034cbd9a2bb6f57342324a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:52:24.385 134145 DEBUG oslo_concurrency.lockutils [req-0e95881a-a378-4911-aca6-fbed7dd1bf45 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:52:24.385 134138 DEBUG oslo_concurrency.lockutils [req-0e95881a-a378-4911-aca6-fbed7dd1bf45 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:52:24.385 134146 DEBUG oslo_concurrency.lockutils [req-0e95881a-a378-4911-aca6-fbed7dd1bf45 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:52:24.385 134146 DEBUG oslo_concurrency.lockutils [req-0e95881a-a378-4911-aca6-fbed7dd1bf45 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:52:24.385 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:24.386 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:24.386 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:24.386 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:24.386 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:24.387 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:24.387 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:24.387 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:24.387 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:24.387 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:24.387 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:24.387 134140 DEBUG oslo_concurrency.lockutils [req-0e95881a-a378-4911-aca6-fbed7dd1bf45 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:52:24.388 134140 DEBUG oslo_concurrency.lockutils [req-0e95881a-a378-4911-aca6-fbed7dd1bf45 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:52:24.388 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:24.388 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:24.389 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:24.459 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c269d51e993a42b6aaae90647de29363 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:52:24.459 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c269d51e993a42b6aaae90647de29363 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:52:24.459 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c269d51e993a42b6aaae90647de29363 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:52:24.459 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:24.460 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c269d51e993a42b6aaae90647de29363 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:52:24.460 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:24.460 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:24.460 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c269d51e993a42b6aaae90647de29363 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:52:24.460 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:24.460 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c269d51e993a42b6aaae90647de29363 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:52:24.460 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:24.460 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:24.460 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:24.460 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:24.460 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:24.460 134138 DEBUG oslo_concurrency.lockutils [req-59125c9b-b876-438b-a69d-3958765e9ca7 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:52:24.461 134138 DEBUG oslo_concurrency.lockutils [req-59125c9b-b876-438b-a69d-3958765e9ca7 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:52:24.461 134145 DEBUG oslo_concurrency.lockutils [req-59125c9b-b876-438b-a69d-3958765e9ca7 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:52:24.461 134140 DEBUG oslo_concurrency.lockutils [req-59125c9b-b876-438b-a69d-3958765e9ca7 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:52:24.461 134145 DEBUG oslo_concurrency.lockutils [req-59125c9b-b876-438b-a69d-3958765e9ca7 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:52:24.461 134140 DEBUG oslo_concurrency.lockutils [req-59125c9b-b876-438b-a69d-3958765e9ca7 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:52:24.461 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:24.462 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:24.462 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:24.462 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:24.462 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:24.462 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:24.462 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:24.462 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:24.462 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:24.463 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c269d51e993a42b6aaae90647de29363 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:52:24.463 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:24.463 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c269d51e993a42b6aaae90647de29363 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:52:24.463 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:24.463 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:24.464 134146 DEBUG oslo_concurrency.lockutils [req-59125c9b-b876-438b-a69d-3958765e9ca7 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:52:24.464 134146 DEBUG oslo_concurrency.lockutils [req-59125c9b-b876-438b-a69d-3958765e9ca7 50499585da76413d8ec8e3869bd7b484 bd5129d12b6d437f895b62124e142c13 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:52:24.465 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:24.465 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:24.465 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:25.463 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:25.463 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:25.463 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:25.463 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:25.463 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:25.464 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:25.464 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:25.464 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:25.465 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:25.466 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:25.466 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:25.466 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:27.466 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:27.466 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:27.466 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:27.466 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:27.466 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:27.466 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:27.467 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:27.467 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:27.467 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:27.468 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:27.468 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:27.468 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:28.077 134140 DEBUG oslo_service.periodic_task [req-b3f25709-1d32-49f7-8cfb-988d15e6a499 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:52:28.081 134140 DEBUG oslo_concurrency.lockutils [req-9e274460-b973-4c72-8338-11efa8efe901 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:52:28.082 134140 DEBUG oslo_concurrency.lockutils [req-9e274460-b973-4c72-8338-11efa8efe901 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:52:31.467 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:31.467 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:31.468 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:31.470 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:31.470 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:52:31.470 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:31.470 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:31.470 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:52:31.470 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:52:31.471 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:31.471 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:31.472 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:32.388 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: 8ee1638e30df495bb535d7c3fb1bbbdb reply to reply_85578e9317f24a84af7e369b8d783622 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-23 01:52:32.389 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:32.389 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 37e4675e09574f7b96439b97ca9f6e58 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:52:32.389 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:32.389 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:32.391 134138 DEBUG nova.scheduler.manager [req-1fa9f0f4-7eac-44be-a8b7-b82c4d8149c2 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Starting to schedule for instances: ['85440c07-c1fb-4ca0-bd8f-0961f3225e52'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-23 01:52:32.393 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:32.393 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:32.394 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:32.396 134138 DEBUG nova.scheduler.request_filter [req-1fa9f0f4-7eac-44be-a8b7-b82c4d8149c2 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:52:32.397 134138 DEBUG nova.scheduler.request_filter [req-1fa9f0f4-7eac-44be-a8b7-b82c4d8149c2 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-23 01:52:32.397 134138 DEBUG nova.scheduler.request_filter [req-1fa9f0f4-7eac-44be-a8b7-b82c4d8149c2 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:52:32.397 134138 DEBUG nova.scheduler.request_filter [req-1fa9f0f4-7eac-44be-a8b7-b82c4d8149c2 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 
01:52:32.397 134138 DEBUG nova.scheduler.request_filter [req-1fa9f0f4-7eac-44be-a8b7-b82c4d8149c2 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:52:32.401 134138 DEBUG oslo_concurrency.lockutils [req-1fa9f0f4-7eac-44be-a8b7-b82c4d8149c2 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:52:32.402 134138 DEBUG oslo_concurrency.lockutils [req-1fa9f0f4-7eac-44be-a8b7-b82c4d8149c2 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:52:32.457 134138 DEBUG oslo_messaging._drivers.amqpdriver [req-1fa9f0f4-7eac-44be-a8b7-b82c4d8149c2 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: b3dead3e4ece4eb2b14a897f23228a34 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:52:32.459 134138 DEBUG oslo_concurrency.lockutils [req-1fa9f0f4-7eac-44be-a8b7-b82c4d8149c2 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock 
"de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:52:32.459 134138 DEBUG oslo_concurrency.lockutils [req-1fa9f0f4-7eac-44be-a8b7-b82c4d8149c2 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:52:32.468 134138 DEBUG oslo_concurrency.lockutils [req-1fa9f0f4-7eac-44be-a8b7-b82c4d8149c2 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:52:32.469 134138 DEBUG nova.scheduler.host_manager [req-1fa9f0f4-7eac-44be-a8b7-b82c4d8149c2 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["fpu", "apic", "pku", "pge", "pse", "fsgsbase", "rtm", "mmx", "avx512bw", "cx16", "avx2", "cmov", "xsavec", "clwb", "smap", "clflushopt", "avx512-vpopcntdq", "pat", "lm", "pclmuldq", "msr", "sse4.2", "ssse3", "arat", "adx", "sse2", "fma", "aes", "rdtscp", "avx512vl", "xsaveopt", "3dnowprefetch", "vme", "hle", "sse4.1", "syscall", "vpclmulqdq", "cx8", 
"lahf_lm", "clflush", "avx512cd", "tsc", "avx512dq", "gfni", "pse36", "wbnoinvd", "sep", "nx", "avx512vbmi", "popcnt", "spec-ctrl", "vaes", "pae", "fxsr", "ssbd", "la57", "avx512f", "hypervisor", "mca", "abm", "avx512vbmi2", "mtrr", "movbe", "pni", "rdseed", "xgetbv1", "pcid", "invpcid", "erms", "ht", "avx512bitalg", "mce", "umip", "de", "rdrand", "pdpe1gb", "smep", "xsave", "bmi1", "bmi2", "avx", "avx512vnni", "x2apic", "tsc-deadline", "f16c", "sse"]}',created_at=2026-04-23T00:42:02Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=26,free_disk_gb=77,free_ram_mb=31387,host='cn-jenkins-deploy-platform-juju-os-697-1',host_ip=10.0.0.129,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-697-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=0,mapped=1,memory_mb=31899,memory_mb_used=512,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035207, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", 
"nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["id", "cpu_usage", "memory_usage", "pinned_cpus", "network_metadata", "siblings", "memory", "socket", "pcpuset", "mempages", "cpuset"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=0,service_id=None,stats={failed_builds='0'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-23T01:52:30Z,uuid=aa32474e-00f6-4c5b-a2ec-d12a3f31dd05,vcpus=8,vcpus_used=0) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-23 01:52:32.470 134138 DEBUG nova.scheduler.host_manager [req-1fa9f0f4-7eac-44be-a8b7-b82c4d8149c2 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-23 01:52:32.472 134138 DEBUG nova.scheduler.host_manager [req-1fa9f0f4-7eac-44be-a8b7-b82c4d8149c2 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with service dict: {'id': 9, 'uuid': '979fb67d-7796-4f44-89f1-b2234e3e1381', 'host': 'cn-jenkins-deploy-platform-juju-os-697-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 421, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 23, 1, 52, 32, 
tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 23, 0, 42, 2, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 23, 1, 52, 32, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-23 01:52:32.473 134138 DEBUG nova.scheduler.host_manager [req-1fa9f0f4-7eac-44be-a8b7-b82c4d8149c2 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-23 01:52:32.473 134138 DEBUG oslo_concurrency.lockutils [req-1fa9f0f4-7eac-44be-a8b7-b82c4d8149c2 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.005s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:52:32.473 134138 INFO nova.scheduler.host_manager [req-1fa9f0f4-7eac-44be-a8b7-b82c4d8149c2 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-697-1 2026-04-23 01:52:32.473 134138 DEBUG nova.scheduler.manager [req-1fa9f0f4-7eac-44be-a8b7-b82c4d8149c2 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0]) _get_sorted_hosts 
/usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-23 01:52:32.474 134138 DEBUG nova.scheduler.manager [req-1fa9f0f4-7eac-44be-a8b7-b82c4d8149c2 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-23 01:52:32.474 134138 DEBUG nova.scheduler.utils [req-1fa9f0f4-7eac-44be-a8b7-b82c4d8149c2 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Attempting to claim resources in the placement API for instance 85440c07-c1fb-4ca0-bd8f-0961f3225e52 claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-23 01:52:32.598 134138 DEBUG nova.scheduler.manager [req-1fa9f0f4-7eac-44be-a8b7-b82c4d8149c2 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] [instance: 85440c07-c1fb-4ca0-bd8f-0961f3225e52] Selected host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-23 01:52:32.599 134138 DEBUG oslo_concurrency.lockutils [req-1fa9f0f4-7eac-44be-a8b7-b82c4d8149c2 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 
2026-04-23 01:52:32.599 134138 DEBUG oslo_concurrency.lockutils [req-1fa9f0f4-7eac-44be-a8b7-b82c4d8149c2 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:52:32.600 134138 DEBUG oslo_messaging._drivers.amqpdriver [req-1fa9f0f4-7eac-44be-a8b7-b82c4d8149c2 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: 7f62ab1e41814233ac32718e48b43a4a NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:52:32.603 134138 DEBUG oslo_messaging._drivers.amqpdriver [req-1fa9f0f4-7eac-44be-a8b7-b82c4d8149c2 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] sending reply msg_id: 8ee1638e30df495bb535d7c3fb1bbbdb reply queue: reply_85578e9317f24a84af7e369b8d783622 time elapsed: 0.2141535640002985s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-23 01:52:33.395 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:33.396 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:33.396 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:35.398 134138 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:35.398 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:35.399 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:35.802 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 7495000c7f214e528f8e6ec749bf293b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:52:35.802 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 7495000c7f214e528f8e6ec749bf293b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:52:35.802 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:35.802 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 7495000c7f214e528f8e6ec749bf293b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:52:35.802 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:35.802 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 7495000c7f214e528f8e6ec749bf293b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:52:35.802 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:35.802 134138 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:35.803 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:35.803 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:35.804 134138 DEBUG oslo_concurrency.lockutils [req-1fa9f0f4-7eac-44be-a8b7-b82c4d8149c2 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:52:35.805 134138 DEBUG oslo_concurrency.lockutils [req-1fa9f0f4-7eac-44be-a8b7-b82c4d8149c2 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:52:35.805 134145 DEBUG oslo_concurrency.lockutils [req-1fa9f0f4-7eac-44be-a8b7-b82c4d8149c2 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:52:35.805 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:35.805 
134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:35.805 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:35.805 134145 DEBUG oslo_concurrency.lockutils [req-1fa9f0f4-7eac-44be-a8b7-b82c4d8149c2 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:52:35.806 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:35.806 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 7495000c7f214e528f8e6ec749bf293b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:52:35.806 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:35.806 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:35.807 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:35.807 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 7495000c7f214e528f8e6ec749bf293b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:52:35.810 134146 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:35.810 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 7495000c7f214e528f8e6ec749bf293b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:52:35.810 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 7495000c7f214e528f8e6ec749bf293b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:52:35.811 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:35.811 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:35.811 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:35.813 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:35.815 134140 DEBUG oslo_concurrency.lockutils [req-1fa9f0f4-7eac-44be-a8b7-b82c4d8149c2 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:52:35.818 134146 DEBUG oslo_concurrency.lockutils [req-1fa9f0f4-7eac-44be-a8b7-b82c4d8149c2 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock 
"host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:52:35.818 134140 DEBUG oslo_concurrency.lockutils [req-1fa9f0f4-7eac-44be-a8b7-b82c4d8149c2 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.003s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:52:35.818 134146 DEBUG oslo_concurrency.lockutils [req-1fa9f0f4-7eac-44be-a8b7-b82c4d8149c2 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:52:35.819 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:35.819 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:35.819 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:35.819 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:35.819 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:35.819 134140 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:36.806 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:36.806 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:36.806 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:36.808 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:36.808 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:36.808 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:36.821 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:36.821 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:36.822 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:36.822 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:36.822 
134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:36.822 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:37.910 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ddf640a9f4bf4a14a605113e4838c835 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:52:37.910 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ddf640a9f4bf4a14a605113e4838c835 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:52:37.910 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ddf640a9f4bf4a14a605113e4838c835 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:52:37.910 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ddf640a9f4bf4a14a605113e4838c835 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:52:37.911 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:37.911 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:37.910 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:37.911 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ddf640a9f4bf4a14a605113e4838c835 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 
01:52:37.911 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ddf640a9f4bf4a14a605113e4838c835 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:52:37.911 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ddf640a9f4bf4a14a605113e4838c835 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:52:37.911 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:37.911 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:37.911 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:37.911 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:37.911 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:37.911 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ddf640a9f4bf4a14a605113e4838c835 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:52:37.911 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:37.911 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:37.911 134138 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:37.911 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:37.912 134145 DEBUG oslo_concurrency.lockutils [req-53b6f1a2-506a-4d23-adf7-15706d739dea - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:52:37.912 134146 DEBUG oslo_concurrency.lockutils [req-53b6f1a2-506a-4d23-adf7-15706d739dea - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:52:37.912 134140 DEBUG oslo_concurrency.lockutils [req-53b6f1a2-506a-4d23-adf7-15706d739dea - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:52:37.912 134146 DEBUG nova.scheduler.host_manager [req-53b6f1a2-506a-4d23-adf7-15706d739dea - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:52:37.912 134145 DEBUG nova.scheduler.host_manager [req-53b6f1a2-506a-4d23-adf7-15706d739dea - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:52:37.912 134140 DEBUG nova.scheduler.host_manager [req-53b6f1a2-506a-4d23-adf7-15706d739dea - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:52:37.912 134145 DEBUG oslo_concurrency.lockutils [req-53b6f1a2-506a-4d23-adf7-15706d739dea - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:52:37.912 134146 DEBUG oslo_concurrency.lockutils [req-53b6f1a2-506a-4d23-adf7-15706d739dea - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:52:37.912 134140 DEBUG oslo_concurrency.lockutils [req-53b6f1a2-506a-4d23-adf7-15706d739dea - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:52:37.912 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:37.912 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:37.912 134138 DEBUG oslo_concurrency.lockutils [req-53b6f1a2-506a-4d23-adf7-15706d739dea - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:52:37.912 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:37.913 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:37.913 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:37.913 134138 DEBUG nova.scheduler.host_manager [req-53b6f1a2-506a-4d23-adf7-15706d739dea - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:52:37.913 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:37.913 134138 DEBUG oslo_concurrency.lockutils [req-53b6f1a2-506a-4d23-adf7-15706d739dea - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:52:37.913 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:37.913 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:37.913 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:37.913 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:37.914 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:37.914 134145 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:38.051 134145 DEBUG oslo_service.periodic_task [req-531bdeff-4dd1-4941-b61a-4ea3e4e4c5a5 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:52:38.056 134145 DEBUG oslo_concurrency.lockutils [req-eb8927b8-2a9c-4424-b35b-eaf8f23ccb2f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:52:38.056 134145 DEBUG oslo_concurrency.lockutils [req-eb8927b8-2a9c-4424-b35b-eaf8f23ccb2f - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:52:38.914 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:38.914 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:38.914 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:38.914 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:38.914 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 
2026-04-23 01:52:38.915 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:38.915 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:38.915 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:38.915 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:38.915 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:38.915 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:38.915 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:40.916 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:40.916 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:40.916 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:40.917 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:40.917 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:40.917 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:40.917 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:40.917 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:40.917 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:40.918 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:40.918 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:40.918 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:44.921 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:44.921 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:44.921 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:44.921 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:44.921 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:44.921 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:44.922 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:44.922 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:44.922 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:44.923 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:44.923 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:44.923 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:49.948 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 4dec4f568af9430db44d51403d9b4575 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:52:49.948 134145 DEBUG 
oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 4dec4f568af9430db44d51403d9b4575 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:52:49.948 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:49.948 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:49.948 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 4dec4f568af9430db44d51403d9b4575 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:52:49.948 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 4dec4f568af9430db44d51403d9b4575 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:52:49.948 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:49.948 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:49.949 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:49.949 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:49.949 134138 DEBUG oslo_concurrency.lockutils [req-d1eb653c-ba1a-4ea8-b499-8a5ad2a73d60 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by 
"nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:52:49.949 134145 DEBUG oslo_concurrency.lockutils [req-d1eb653c-ba1a-4ea8-b499-8a5ad2a73d60 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:52:49.949 134138 DEBUG oslo_concurrency.lockutils [req-d1eb653c-ba1a-4ea8-b499-8a5ad2a73d60 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:52:49.949 134145 DEBUG oslo_concurrency.lockutils [req-d1eb653c-ba1a-4ea8-b499-8a5ad2a73d60 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:52:49.949 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 4dec4f568af9430db44d51403d9b4575 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:52:49.950 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:49.950 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 4dec4f568af9430db44d51403d9b4575 poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:52:49.951 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:49.951 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:49.951 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:49.951 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:49.951 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:49.951 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:49.951 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:49.951 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:49.951 134140 DEBUG oslo_concurrency.lockutils [req-d1eb653c-ba1a-4ea8-b499-8a5ad2a73d60 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:52:49.951 134140 DEBUG 
oslo_concurrency.lockutils [req-d1eb653c-ba1a-4ea8-b499-8a5ad2a73d60 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:52:49.952 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 4dec4f568af9430db44d51403d9b4575 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:52:49.953 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:49.953 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 4dec4f568af9430db44d51403d9b4575 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:52:49.953 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:49.953 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:49.954 134146 DEBUG oslo_concurrency.lockutils [req-d1eb653c-ba1a-4ea8-b499-8a5ad2a73d60 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:52:49.954 134146 DEBUG oslo_concurrency.lockutils [req-d1eb653c-ba1a-4ea8-b499-8a5ad2a73d60 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 
463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:52:49.954 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:49.954 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:49.955 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:49.955 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:49.956 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:49.956 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:50.953 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:50.953 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:50.953 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:50.953 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:50.953 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:50.953 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:50.955 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:50.956 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:50.956 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:50.957 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:50.957 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:50.957 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:52.095 134146 DEBUG oslo_service.periodic_task [req-8e85047f-6c8e-410d-85e3-427b0f9673d9 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:52:52.100 134146 DEBUG oslo_concurrency.lockutils [req-4c382421-9c0b-4333-9c8c-141623220080 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:52:52.100 134146 DEBUG oslo_concurrency.lockutils [req-4c382421-9c0b-4333-9c8c-141623220080 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:52:52.954 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:52.954 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:52.954 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:52.954 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:52.954 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:52.954 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:52.957 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:52.957 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:52.957 134140 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:52.958 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:52.959 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:52.959 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:53.029 134138 DEBUG oslo_service.periodic_task [req-44f6919a-93ed-4166-bb78-5f514a8a3e07 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:52:53.033 134138 DEBUG oslo_concurrency.lockutils [req-7fcba006-2a98-47b7-ae03-fb4b09ffdd5d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:52:53.034 134138 DEBUG oslo_concurrency.lockutils [req-7fcba006-2a98-47b7-ae03-fb4b09ffdd5d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:52:53.802 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: 5f8e4d59917149f3ab3f671d7712c6d4 reply to reply_85578e9317f24a84af7e369b8d783622 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-23 01:52:53.802 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:53.802 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ac868b93a1ec44c9a56bfd4fe71e4ec3 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:52:53.802 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:53.803 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:53.804 134145 DEBUG nova.scheduler.manager [req-7db34d56-dfc2-4e58-ab1e-e6ce16f84ac8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Starting to schedule for instances: ['6b54ece2-97ea-450f-9549-e535c900b46f'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-23 01:52:53.806 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:53.806 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:53.806 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:53.809 134145 DEBUG nova.scheduler.request_filter [req-7db34d56-dfc2-4e58-ab1e-e6ce16f84ac8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 
01:52:53.810 134145 DEBUG nova.scheduler.request_filter [req-7db34d56-dfc2-4e58-ab1e-e6ce16f84ac8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-23 01:52:53.810 134145 DEBUG nova.scheduler.request_filter [req-7db34d56-dfc2-4e58-ab1e-e6ce16f84ac8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:52:53.810 134145 DEBUG nova.scheduler.request_filter [req-7db34d56-dfc2-4e58-ab1e-e6ce16f84ac8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:52:53.810 134145 DEBUG nova.scheduler.request_filter [req-7db34d56-dfc2-4e58-ab1e-e6ce16f84ac8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:52:53.814 134145 DEBUG oslo_concurrency.lockutils [req-7db34d56-dfc2-4e58-ab1e-e6ce16f84ac8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 
2026-04-23 01:52:53.815 134145 DEBUG oslo_concurrency.lockutils [req-7db34d56-dfc2-4e58-ab1e-e6ce16f84ac8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:52:53.865 134145 DEBUG oslo_messaging._drivers.amqpdriver [req-7db34d56-dfc2-4e58-ab1e-e6ce16f84ac8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: 0c442c5c4a3d426688fb225c0b03e059 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:52:53.867 134145 DEBUG oslo_concurrency.lockutils [req-7db34d56-dfc2-4e58-ab1e-e6ce16f84ac8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:52:53.867 134145 DEBUG oslo_concurrency.lockutils [req-7db34d56-dfc2-4e58-ab1e-e6ce16f84ac8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:52:53.876 134145 DEBUG oslo_concurrency.lockutils [req-7db34d56-dfc2-4e58-ab1e-e6ce16f84ac8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 
463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:52:53.876 134145 DEBUG nova.scheduler.host_manager [req-7db34d56-dfc2-4e58-ab1e-e6ce16f84ac8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["fpu", "apic", "pku", "pge", "pse", "fsgsbase", "rtm", "mmx", "avx512bw", "cx16", "avx2", "cmov", "xsavec", "clwb", "smap", "clflushopt", "avx512-vpopcntdq", "pat", "lm", "pclmuldq", "msr", "sse4.2", "ssse3", "arat", "adx", "sse2", "fma", "aes", "rdtscp", "avx512vl", "xsaveopt", "3dnowprefetch", "vme", "hle", "sse4.1", "syscall", "vpclmulqdq", "cx8", "lahf_lm", "clflush", "avx512cd", "tsc", "avx512dq", "gfni", "pse36", "wbnoinvd", "sep", "nx", "avx512vbmi", "popcnt", "spec-ctrl", "vaes", "pae", "fxsr", "ssbd", "la57", "avx512f", "hypervisor", "mca", "abm", "avx512vbmi2", "mtrr", "movbe", "pni", "rdseed", "xgetbv1", "pcid", "invpcid", "erms", "ht", "avx512bitalg", "mce", "umip", "de", "rdrand", "pdpe1gb", "smep", "xsave", "bmi1", "bmi2", "avx", "avx512vnni", "x2apic", "tsc-deadline", "f16c", 
"sse"]}',created_at=2026-04-23T00:42:02Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=26,free_disk_gb=77,free_ram_mb=31387,host='cn-jenkins-deploy-platform-juju-os-697-1',host_ip=10.0.0.129,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-697-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=0,mapped=1,memory_mb=31899,memory_mb_used=512,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035207, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["id", "cpu_usage", "memory_usage", "pinned_cpus", 
"network_metadata", "siblings", "memory", "socket", "pcpuset", "mempages", "cpuset"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=0,service_id=None,stats={failed_builds='0',io_workload='0',num_instances='0',num_os_type_None='0',num_proj_f33673cce13045b58a0d813caec251fb='0',num_task_None='0',num_vm_building='0'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-23T01:52:50Z,uuid=aa32474e-00f6-4c5b-a2ec-d12a3f31dd05,vcpus=8,vcpus_used=0) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-23 01:52:53.877 134145 DEBUG nova.scheduler.host_manager [req-7db34d56-dfc2-4e58-ab1e-e6ce16f84ac8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-23 01:52:53.877 134145 DEBUG nova.scheduler.host_manager [req-7db34d56-dfc2-4e58-ab1e-e6ce16f84ac8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with service dict: {'id': 9, 'uuid': '979fb67d-7796-4f44-89f1-b2234e3e1381', 'host': 'cn-jenkins-deploy-platform-juju-os-697-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 423, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 23, 1, 52, 52, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 23, 0, 42, 2, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 23, 1, 52, 52, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update 
/usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-23 01:52:53.877 134145 DEBUG nova.scheduler.host_manager [req-7db34d56-dfc2-4e58-ab1e-e6ce16f84ac8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-23 01:52:53.877 134145 DEBUG oslo_concurrency.lockutils [req-7db34d56-dfc2-4e58-ab1e-e6ce16f84ac8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:52:53.878 134145 INFO nova.scheduler.host_manager [req-7db34d56-dfc2-4e58-ab1e-e6ce16f84ac8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-697-1 2026-04-23 01:52:53.878 134145 DEBUG nova.scheduler.manager [req-7db34d56-dfc2-4e58-ab1e-e6ce16f84ac8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-23 01:52:53.878 134145 DEBUG nova.scheduler.manager [req-7db34d56-dfc2-4e58-ab1e-e6ce16f84ac8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Weighed [WeighedHost [host: 
(cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-23 01:52:53.879 134145 DEBUG nova.scheduler.utils [req-7db34d56-dfc2-4e58-ab1e-e6ce16f84ac8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Attempting to claim resources in the placement API for instance 6b54ece2-97ea-450f-9549-e535c900b46f claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-23 01:52:53.951 134145 DEBUG nova.scheduler.manager [req-7db34d56-dfc2-4e58-ab1e-e6ce16f84ac8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] [instance: 6b54ece2-97ea-450f-9549-e535c900b46f] Selected host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-23 01:52:53.952 134145 DEBUG oslo_concurrency.lockutils [req-7db34d56-dfc2-4e58-ab1e-e6ce16f84ac8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:52:53.952 134145 DEBUG oslo_concurrency.lockutils [req-7db34d56-dfc2-4e58-ab1e-e6ce16f84ac8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 
'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:52:53.953 134145 DEBUG oslo_messaging._drivers.amqpdriver [req-7db34d56-dfc2-4e58-ab1e-e6ce16f84ac8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: b26b115bd9214f888367e35e8726d2a2 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:52:53.955 134145 DEBUG oslo_messaging._drivers.amqpdriver [req-7db34d56-dfc2-4e58-ab1e-e6ce16f84ac8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] sending reply msg_id: 5f8e4d59917149f3ab3f671d7712c6d4 reply queue: reply_85578e9317f24a84af7e369b8d783622 time elapsed: 0.15303940200010402s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-23 01:52:54.808 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:54.808 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:54.808 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:56.809 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:56.810 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:56.810 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:56.958 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:56.958 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:56.958 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:56.959 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:56.959 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:56.960 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:56.961 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:56.961 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:56.961 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:56.978 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with 
unique_id: 1a9b9163e4d448feb9bd8ba8ba0a5abe __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:52:56.978 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1a9b9163e4d448feb9bd8ba8ba0a5abe __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:52:56.978 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1a9b9163e4d448feb9bd8ba8ba0a5abe __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:52:56.978 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:56.978 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:56.978 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:56.979 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1a9b9163e4d448feb9bd8ba8ba0a5abe poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:52:56.979 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1a9b9163e4d448feb9bd8ba8ba0a5abe poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:52:56.979 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:56.979 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:56.979 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener 
is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:56.979 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:56.979 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1a9b9163e4d448feb9bd8ba8ba0a5abe poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:52:56.980 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:56.980 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:56.981 134146 DEBUG oslo_concurrency.lockutils [req-7db34d56-dfc2-4e58-ab1e-e6ce16f84ac8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:52:56.981 134146 DEBUG oslo_concurrency.lockutils [req-7db34d56-dfc2-4e58-ab1e-e6ce16f84ac8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:52:56.982 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:56.982 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running 
poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:56.982 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:56.981 134140 DEBUG oslo_concurrency.lockutils [req-7db34d56-dfc2-4e58-ab1e-e6ce16f84ac8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:52:56.982 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1a9b9163e4d448feb9bd8ba8ba0a5abe __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:52:56.982 134145 DEBUG oslo_concurrency.lockutils [req-7db34d56-dfc2-4e58-ab1e-e6ce16f84ac8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:52:56.983 134145 DEBUG oslo_concurrency.lockutils [req-7db34d56-dfc2-4e58-ab1e-e6ce16f84ac8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:52:56.983 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:56.983 134138 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1a9b9163e4d448feb9bd8ba8ba0a5abe poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:52:56.983 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:56.983 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:56.983 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:56.983 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:56.983 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:56.983 134140 DEBUG oslo_concurrency.lockutils [req-7db34d56-dfc2-4e58-ab1e-e6ce16f84ac8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:52:56.984 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:56.984 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:56.984 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:56.986 134138 DEBUG oslo_concurrency.lockutils [req-7db34d56-dfc2-4e58-ab1e-e6ce16f84ac8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:52:56.986 134138 DEBUG oslo_concurrency.lockutils [req-7db34d56-dfc2-4e58-ab1e-e6ce16f84ac8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:52:56.987 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:56.987 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:56.987 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:57.984 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:57.984 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:57.984 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:57.984 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:57.984 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:57.984 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:57.985 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:57.985 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:57.985 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:57.988 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:57.988 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:57.988 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:58.087 134140 DEBUG oslo_service.periodic_task [req-9e274460-b973-4c72-8338-11efa8efe901 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:52:58.093 
134140 DEBUG oslo_concurrency.lockutils [req-c698a2d9-d001-4240-b596-8bf8044966d5 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:52:58.093 134140 DEBUG oslo_concurrency.lockutils [req-c698a2d9-d001-4240-b596-8bf8044966d5 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:52:59.986 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:59.986 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:59.986 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:59.987 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:59.987 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:59.987 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:59.988 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:59.988 134140 DEBUG oslo_messaging._drivers.amqpdriver 
[-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:59.988 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:52:59.991 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:52:59.991 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:52:59.991 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:03.990 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:03.990 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:03.990 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:03.990 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:03.991 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:03.991 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:03.991 134140 DEBUG oslo_messaging._drivers.amqpdriver 
[-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:03.991 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:03.991 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:03.993 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:03.993 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:03.994 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:09.052 134145 DEBUG oslo_service.periodic_task [req-eb8927b8-2a9c-4424-b35b-eaf8f23ccb2f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:53:09.056 134145 DEBUG oslo_concurrency.lockutils [req-d9094395-6802-45a9-b088-8700b91a8be9 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:53:09.057 134145 DEBUG oslo_concurrency.lockutils [req-d9094395-6802-45a9-b088-8700b91a8be9 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:53:11.994 134145 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:11.994 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:11.994 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:11.994 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:11.994 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:11.995 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:11.996 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:11.996 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:11.997 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:12.002 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:12.002 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:12.002 134138 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:15.728 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 4cb71ab734a14d6592be3044a46b2abe __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:53:15.728 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 4cb71ab734a14d6592be3044a46b2abe __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:53:15.728 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 4cb71ab734a14d6592be3044a46b2abe __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:53:15.728 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 4cb71ab734a14d6592be3044a46b2abe __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:53:15.728 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:15.728 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:15.728 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 4cb71ab734a14d6592be3044a46b2abe poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:53:15.728 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 4cb71ab734a14d6592be3044a46b2abe poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:53:15.728 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:15.728 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 4cb71ab734a14d6592be3044a46b2abe poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:53:15.728 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:15.728 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:15.728 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:15.728 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:15.729 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:15.729 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:15.729 134140 DEBUG oslo_concurrency.lockutils [req-cfd63af5-3ff0-4820-b04b-79b089f885d3 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:53:15.729 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 
01:53:15.729 134140 DEBUG oslo_concurrency.lockutils [req-cfd63af5-3ff0-4820-b04b-79b089f885d3 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:53:15.729 134138 DEBUG oslo_concurrency.lockutils [req-cfd63af5-3ff0-4820-b04b-79b089f885d3 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:53:15.729 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 4cb71ab734a14d6592be3044a46b2abe poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:53:15.729 134145 DEBUG oslo_concurrency.lockutils [req-cfd63af5-3ff0-4820-b04b-79b089f885d3 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:53:15.729 134138 DEBUG oslo_concurrency.lockutils [req-cfd63af5-3ff0-4820-b04b-79b089f885d3 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:53:15.730 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] 
Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:15.729 134145 DEBUG oslo_concurrency.lockutils [req-cfd63af5-3ff0-4820-b04b-79b089f885d3 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:53:15.730 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:15.730 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:15.730 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:15.730 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:15.730 134146 DEBUG oslo_concurrency.lockutils [req-cfd63af5-3ff0-4820-b04b-79b089f885d3 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:53:15.731 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:15.731 134146 DEBUG oslo_concurrency.lockutils [req-cfd63af5-3ff0-4820-b04b-79b089f885d3 
b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:53:15.731 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:15.731 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:15.731 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:15.731 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:15.731 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:15.732 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:15.732 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:15.732 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:16.731 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:16.732 134145 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:16.732 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:16.732 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:16.732 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:16.732 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:16.733 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:16.733 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:16.733 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:16.734 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:16.735 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:16.735 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:18.734 134145 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:18.734 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:18.734 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:18.735 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:18.735 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:18.735 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:18.735 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:18.735 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:18.735 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:18.737 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:18.738 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:18.738 134146 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:20.528 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: 36de9123f8a7421a8c9bc4faab26ebd5 reply to reply_b20bc86560af44159724b4fea607f8c2 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-23 01:53:20.528 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:20.528 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 91245e9d7824406ea66793da64cd5d67 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:53:20.529 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:20.529 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:20.531 134140 DEBUG nova.scheduler.manager [req-2c2e43ae-4980-4c6f-8593-2327a27afcd8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Starting to schedule for instances: ['b8708797-50a2-42f1-9a3a-54c40336f4c8'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-23 01:53:20.532 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:20.532 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:20.533 134140 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:20.537 134140 DEBUG nova.scheduler.request_filter [req-2c2e43ae-4980-4c6f-8593-2327a27afcd8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:53:20.537 134140 DEBUG nova.scheduler.request_filter [req-2c2e43ae-4980-4c6f-8593-2327a27afcd8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-23 01:53:20.537 134140 DEBUG nova.scheduler.request_filter [req-2c2e43ae-4980-4c6f-8593-2327a27afcd8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:53:20.538 134140 DEBUG nova.scheduler.request_filter [req-2c2e43ae-4980-4c6f-8593-2327a27afcd8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:53:20.965 134140 DEBUG nova.scheduler.request_filter [req-2c2e43ae-4980-4c6f-8593-2327a27afcd8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'remote_managed_ports_filter' took 0.4 
seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:53:20.970 134140 DEBUG oslo_concurrency.lockutils [req-2c2e43ae-4980-4c6f-8593-2327a27afcd8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:53:20.970 134140 DEBUG oslo_concurrency.lockutils [req-2c2e43ae-4980-4c6f-8593-2327a27afcd8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:53:21.030 134140 DEBUG oslo_messaging._drivers.amqpdriver [req-2c2e43ae-4980-4c6f-8593-2327a27afcd8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: 96472056533f46ef92809db35229a289 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:53:21.032 134140 DEBUG oslo_concurrency.lockutils [req-2c2e43ae-4980-4c6f-8593-2327a27afcd8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:53:21.032 134140 DEBUG oslo_concurrency.lockutils [req-2c2e43ae-4980-4c6f-8593-2327a27afcd8 
b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:53:21.040 134140 DEBUG oslo_concurrency.lockutils [req-2c2e43ae-4980-4c6f-8593-2327a27afcd8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:53:21.040 134140 DEBUG nova.scheduler.host_manager [req-2c2e43ae-4980-4c6f-8593-2327a27afcd8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["fpu", "apic", "pku", "pge", "pse", "fsgsbase", "rtm", "mmx", "avx512bw", "cx16", "avx2", "cmov", "xsavec", "clwb", "smap", "clflushopt", "avx512-vpopcntdq", "pat", "lm", "pclmuldq", "msr", "sse4.2", "ssse3", "arat", "adx", "sse2", "fma", "aes", "rdtscp", "avx512vl", "xsaveopt", "3dnowprefetch", "vme", "hle", "sse4.1", "syscall", "vpclmulqdq", "cx8", "lahf_lm", "clflush", "avx512cd", "tsc", "avx512dq", "gfni", "pse36", "wbnoinvd", "sep", "nx", "avx512vbmi", "popcnt", "spec-ctrl", "vaes", "pae", "fxsr", "ssbd", "la57", "avx512f", "hypervisor", "mca", "abm", "avx512vbmi2", "mtrr", "movbe", "pni", "rdseed", "xgetbv1", "pcid", "invpcid", "erms", "ht", "avx512bitalg", 
"mce", "umip", "de", "rdrand", "pdpe1gb", "smep", "xsave", "bmi1", "bmi2", "avx", "avx512vnni", "x2apic", "tsc-deadline", "f16c", "sse"]}',created_at=2026-04-23T00:42:02Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=26,free_disk_gb=77,free_ram_mb=31387,host='cn-jenkins-deploy-platform-juju-os-697-1',host_ip=10.0.0.129,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-697-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=0,mapped=1,memory_mb=31899,memory_mb_used=512,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035207, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, 
"nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["id", "cpu_usage", "memory_usage", "pinned_cpus", "network_metadata", "siblings", "memory", "socket", "pcpuset", "mempages", "cpuset"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=0,service_id=None,stats={failed_builds='0',io_workload='0',num_instances='0',num_os_type_None='0',num_proj_f33673cce13045b58a0d813caec251fb='0',num_task_None='0',num_vm_building='0'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-23T01:53:16Z,uuid=aa32474e-00f6-4c5b-a2ec-d12a3f31dd05,vcpus=8,vcpus_used=0) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-23 01:53:21.041 134140 DEBUG nova.scheduler.host_manager [req-2c2e43ae-4980-4c6f-8593-2327a27afcd8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-23 01:53:21.041 134140 DEBUG nova.scheduler.host_manager [req-2c2e43ae-4980-4c6f-8593-2327a27afcd8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with service dict: {'id': 9, 'uuid': '979fb67d-7796-4f44-89f1-b2234e3e1381', 'host': 'cn-jenkins-deploy-platform-juju-os-697-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 425, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 23, 1, 53, 12, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 23, 0, 42, 2, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 23, 1, 
53, 12, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-23 01:53:21.042 134140 DEBUG nova.scheduler.host_manager [req-2c2e43ae-4980-4c6f-8593-2327a27afcd8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-23 01:53:21.042 134140 DEBUG oslo_concurrency.lockutils [req-2c2e43ae-4980-4c6f-8593-2327a27afcd8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:53:21.042 134140 INFO nova.scheduler.host_manager [req-2c2e43ae-4980-4c6f-8593-2327a27afcd8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-697-1 2026-04-23 01:53:21.042 134140 DEBUG nova.scheduler.manager [req-2c2e43ae-4980-4c6f-8593-2327a27afcd8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-23 01:53:21.042 134140 DEBUG nova.scheduler.manager [req-2c2e43ae-4980-4c6f-8593-2327a27afcd8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 
463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-23 01:53:21.043 134140 DEBUG nova.scheduler.utils [req-2c2e43ae-4980-4c6f-8593-2327a27afcd8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Attempting to claim resources in the placement API for instance b8708797-50a2-42f1-9a3a-54c40336f4c8 claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-23 01:53:21.124 134140 DEBUG nova.scheduler.manager [req-2c2e43ae-4980-4c6f-8593-2327a27afcd8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] [instance: b8708797-50a2-42f1-9a3a-54c40336f4c8] Selected host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-23 01:53:21.125 134140 DEBUG oslo_concurrency.lockutils [req-2c2e43ae-4980-4c6f-8593-2327a27afcd8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:53:21.125 134140 DEBUG oslo_concurrency.lockutils [req-2c2e43ae-4980-4c6f-8593-2327a27afcd8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 
463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:53:21.127 134140 DEBUG oslo_messaging._drivers.amqpdriver [req-2c2e43ae-4980-4c6f-8593-2327a27afcd8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: 71e0473377074781be3df1cec15b5ea9 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:53:21.129 134140 DEBUG oslo_messaging._drivers.amqpdriver [req-2c2e43ae-4980-4c6f-8593-2327a27afcd8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] sending reply msg_id: 36de9123f8a7421a8c9bc4faab26ebd5 reply queue: reply_b20bc86560af44159724b4fea607f8c2 time elapsed: 0.6010669690003851s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-23 01:53:21.455 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: 6cfb414c41b544079ce374837a185def reply to reply_b20bc86560af44159724b4fea607f8c2 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-23 01:53:21.456 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:21.456 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ecb586b630034331b4cc17ca37e298f0 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:53:21.456 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:21.456 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:21.458 134146 DEBUG nova.scheduler.manager [req-3125e7b4-9067-41bb-90cd-68287cf37979 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Starting to schedule for instances: ['f337133e-017e-46db-9645-afff2c4396f1'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-23 01:53:21.460 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:21.460 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:21.460 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:21.462 134146 DEBUG nova.scheduler.request_filter [req-3125e7b4-9067-41bb-90cd-68287cf37979 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:53:21.463 134146 DEBUG nova.scheduler.request_filter [req-3125e7b4-9067-41bb-90cd-68287cf37979 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 
2026-04-23 01:53:21.463 134146 DEBUG nova.scheduler.request_filter [req-3125e7b4-9067-41bb-90cd-68287cf37979 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:53:21.463 134146 DEBUG nova.scheduler.request_filter [req-3125e7b4-9067-41bb-90cd-68287cf37979 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:53:21.534 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:21.535 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:21.535 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:21.581 134146 DEBUG nova.scheduler.request_filter [req-3125e7b4-9067-41bb-90cd-68287cf37979 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'remote_managed_ports_filter' took 0.1 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:53:21.589 134146 DEBUG oslo_concurrency.lockutils [req-3125e7b4-9067-41bb-90cd-68287cf37979 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:53:21.589 134146 DEBUG oslo_concurrency.lockutils [req-3125e7b4-9067-41bb-90cd-68287cf37979 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:53:21.642 134146 DEBUG oslo_messaging._drivers.amqpdriver [req-3125e7b4-9067-41bb-90cd-68287cf37979 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: 72d6214fda6f450782d092bd419443a5 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:53:21.644 134146 DEBUG oslo_concurrency.lockutils [req-3125e7b4-9067-41bb-90cd-68287cf37979 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:53:21.644 134146 DEBUG oslo_concurrency.lockutils [req-3125e7b4-9067-41bb-90cd-68287cf37979 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:53:21.655 
134146 DEBUG oslo_concurrency.lockutils [req-3125e7b4-9067-41bb-90cd-68287cf37979 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:53:21.655 134146 DEBUG nova.scheduler.host_manager [req-3125e7b4-9067-41bb-90cd-68287cf37979 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["fpu", "apic", "pku", "pge", "pse", "fsgsbase", "rtm", "mmx", "avx512bw", "cx16", "avx2", "cmov", "xsavec", "clwb", "smap", "clflushopt", "avx512-vpopcntdq", "pat", "lm", "pclmuldq", "msr", "sse4.2", "ssse3", "arat", "adx", "sse2", "fma", "aes", "rdtscp", "avx512vl", "xsaveopt", "3dnowprefetch", "vme", "hle", "sse4.1", "syscall", "vpclmulqdq", "cx8", "lahf_lm", "clflush", "avx512cd", "tsc", "avx512dq", "gfni", "pse36", "wbnoinvd", "sep", "nx", "avx512vbmi", "popcnt", "spec-ctrl", "vaes", "pae", "fxsr", "ssbd", "la57", "avx512f", "hypervisor", "mca", "abm", "avx512vbmi2", "mtrr", "movbe", "pni", "rdseed", "xgetbv1", "pcid", "invpcid", "erms", "ht", "avx512bitalg", "mce", "umip", "de", "rdrand", "pdpe1gb", "smep", "xsave", "bmi1", "bmi2", "avx", "avx512vnni", "x2apic", "tsc-deadline", "f16c", 
"sse"]}',created_at=2026-04-23T00:42:02Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=26,free_disk_gb=76,free_ram_mb=30363,host='cn-jenkins-deploy-platform-juju-os-697-1',host_ip=10.0.0.129,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-697-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=1,mapped=1,memory_mb=31899,memory_mb_used=1536,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035207, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["id", "cpu_usage", "memory_usage", "pinned_cpus", 
"network_metadata", "siblings", "memory", "socket", "pcpuset", "mempages", "cpuset"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=1,service_id=None,stats={failed_builds='0',io_workload='1',num_instances='1',num_os_type_None='1',num_proj_f33673cce13045b58a0d813caec251fb='1',num_task_None='1',num_vm_building='1'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-23T01:53:21Z,uuid=aa32474e-00f6-4c5b-a2ec-d12a3f31dd05,vcpus=8,vcpus_used=1) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-23 01:53:21.656 134146 DEBUG nova.scheduler.host_manager [req-3125e7b4-9067-41bb-90cd-68287cf37979 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-23 01:53:21.657 134146 DEBUG nova.scheduler.host_manager [req-3125e7b4-9067-41bb-90cd-68287cf37979 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with service dict: {'id': 9, 'uuid': '979fb67d-7796-4f44-89f1-b2234e3e1381', 'host': 'cn-jenkins-deploy-platform-juju-os-697-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 425, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 23, 1, 53, 12, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 23, 0, 42, 2, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 23, 1, 53, 12, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update 
/usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-23 01:53:21.658 134146 DEBUG nova.scheduler.host_manager [req-3125e7b4-9067-41bb-90cd-68287cf37979 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-23 01:53:21.658 134146 DEBUG oslo_concurrency.lockutils [req-3125e7b4-9067-41bb-90cd-68287cf37979 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.003s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:53:21.659 134146 INFO nova.scheduler.host_manager [req-3125e7b4-9067-41bb-90cd-68287cf37979 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-697-1 2026-04-23 01:53:21.659 134146 DEBUG nova.scheduler.manager [req-3125e7b4-9067-41bb-90cd-68287cf37979 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 30363MB disk: 26624MB io_ops: 1 instances: 1]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-23 01:53:21.659 134146 DEBUG nova.scheduler.manager [req-3125e7b4-9067-41bb-90cd-68287cf37979 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Weighed [WeighedHost [host: 
(cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 30363MB disk: 26624MB io_ops: 1 instances: 1, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-23 01:53:21.659 134146 DEBUG nova.scheduler.utils [req-3125e7b4-9067-41bb-90cd-68287cf37979 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Attempting to claim resources in the placement API for instance f337133e-017e-46db-9645-afff2c4396f1 claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-23 01:53:21.735 134146 DEBUG nova.scheduler.manager [req-3125e7b4-9067-41bb-90cd-68287cf37979 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] [instance: f337133e-017e-46db-9645-afff2c4396f1] Selected host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 30363MB disk: 26624MB io_ops: 1 instances: 1 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-23 01:53:21.736 134146 DEBUG oslo_concurrency.lockutils [req-3125e7b4-9067-41bb-90cd-68287cf37979 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:53:21.736 134146 DEBUG oslo_concurrency.lockutils [req-3125e7b4-9067-41bb-90cd-68287cf37979 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 
'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:53:21.738 134146 DEBUG oslo_messaging._drivers.amqpdriver [req-3125e7b4-9067-41bb-90cd-68287cf37979 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: a78c3df376ab41919a78cf633c8fa244 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:53:21.739 134146 DEBUG oslo_messaging._drivers.amqpdriver [req-3125e7b4-9067-41bb-90cd-68287cf37979 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] sending reply msg_id: 6cfb414c41b544079ce374837a185def reply queue: reply_b20bc86560af44159724b4fea607f8c2 time elapsed: 0.2833493100006308s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-23 01:53:22.462 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:22.462 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:22.462 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:22.678 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: 11a515b6ddb245888a15cad4b7e8ffa9 reply to reply_196060a39108420a873d953cb9a1134e __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-23 01:53:22.679 134138 DEBUG oslo_messaging._drivers.amqpdriver 
[-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:22.679 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 80867d8cb2b74b4882f21a3a3ac10f15 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:53:22.679 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:22.679 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:22.681 134138 DEBUG nova.scheduler.manager [req-5c26a6ff-6204-4a04-b748-ddfff61c5b9f b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Starting to schedule for instances: ['0076cb7f-4510-490c-bc62-d75ef0230241'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-23 01:53:22.682 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:22.683 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:22.683 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:22.686 134138 DEBUG nova.scheduler.request_filter [req-5c26a6ff-6204-4a04-b748-ddfff61c5b9f b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper 
/usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:53:22.686 134138 DEBUG nova.scheduler.request_filter [req-5c26a6ff-6204-4a04-b748-ddfff61c5b9f b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-23 01:53:22.686 134138 DEBUG nova.scheduler.request_filter [req-5c26a6ff-6204-4a04-b748-ddfff61c5b9f b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:53:22.686 134138 DEBUG nova.scheduler.request_filter [req-5c26a6ff-6204-4a04-b748-ddfff61c5b9f b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:53:22.687 134138 DEBUG nova.scheduler.request_filter [req-5c26a6ff-6204-4a04-b748-ddfff61c5b9f b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:53:22.691 134138 DEBUG oslo_concurrency.lockutils [req-5c26a6ff-6204-4a04-b748-ddfff61c5b9f b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 
0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:53:22.692 134138 DEBUG oslo_concurrency.lockutils [req-5c26a6ff-6204-4a04-b748-ddfff61c5b9f b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:53:22.738 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:22.738 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:22.738 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:22.751 134138 DEBUG oslo_messaging._drivers.amqpdriver [req-5c26a6ff-6204-4a04-b748-ddfff61c5b9f b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: 10f1fe49fbf249bc9e174a76918b2ff7 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:53:22.755 134138 DEBUG oslo_concurrency.lockutils [req-5c26a6ff-6204-4a04-b748-ddfff61c5b9f b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:53:22.755 
134138 DEBUG oslo_concurrency.lockutils [req-5c26a6ff-6204-4a04-b748-ddfff61c5b9f b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:53:22.767 134138 DEBUG oslo_concurrency.lockutils [req-5c26a6ff-6204-4a04-b748-ddfff61c5b9f b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:53:22.768 134138 DEBUG nova.scheduler.host_manager [req-5c26a6ff-6204-4a04-b748-ddfff61c5b9f b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["fpu", "apic", "pku", "pge", "pse", "fsgsbase", "rtm", "mmx", "avx512bw", "cx16", "avx2", "cmov", "xsavec", "clwb", "smap", "clflushopt", "avx512-vpopcntdq", "pat", "lm", "pclmuldq", "msr", "sse4.2", "ssse3", "arat", "adx", "sse2", "fma", "aes", "rdtscp", "avx512vl", "xsaveopt", "3dnowprefetch", "vme", "hle", "sse4.1", "syscall", "vpclmulqdq", "cx8", "lahf_lm", "clflush", "avx512cd", "tsc", "avx512dq", "gfni", "pse36", "wbnoinvd", "sep", "nx", "avx512vbmi", "popcnt", "spec-ctrl", "vaes", "pae", "fxsr", "ssbd", "la57", "avx512f", "hypervisor", "mca", "abm", "avx512vbmi2", "mtrr", 
"movbe", "pni", "rdseed", "xgetbv1", "pcid", "invpcid", "erms", "ht", "avx512bitalg", "mce", "umip", "de", "rdrand", "pdpe1gb", "smep", "xsave", "bmi1", "bmi2", "avx", "avx512vnni", "x2apic", "tsc-deadline", "f16c", "sse"]}',created_at=2026-04-23T00:42:02Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=26,free_disk_gb=75,free_ram_mb=29339,host='cn-jenkins-deploy-platform-juju-os-697-1',host_ip=10.0.0.129,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-697-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=2,mapped=1,memory_mb=31899,memory_mb_used=2560,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035207, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", 
"nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["id", "cpu_usage", "memory_usage", "pinned_cpus", "network_metadata", "siblings", "memory", "socket", "pcpuset", "mempages", "cpuset"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=2,service_id=None,stats={failed_builds='0',io_workload='2',num_instances='2',num_os_type_None='2',num_proj_f33673cce13045b58a0d813caec251fb='2',num_task_None='2',num_vm_building='2'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-23T01:53:22Z,uuid=aa32474e-00f6-4c5b-a2ec-d12a3f31dd05,vcpus=8,vcpus_used=2) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-23 01:53:22.769 134138 DEBUG nova.scheduler.host_manager [req-5c26a6ff-6204-4a04-b748-ddfff61c5b9f b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-23 01:53:22.769 134138 DEBUG nova.scheduler.host_manager [req-5c26a6ff-6204-4a04-b748-ddfff61c5b9f b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with service dict: {'id': 9, 'uuid': '979fb67d-7796-4f44-89f1-b2234e3e1381', 'host': 'cn-jenkins-deploy-platform-juju-os-697-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 426, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 23, 1, 53, 22, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 23, 0, 
42, 2, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 23, 1, 53, 22, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-23 01:53:22.769 134138 DEBUG nova.scheduler.host_manager [req-5c26a6ff-6204-4a04-b748-ddfff61c5b9f b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-23 01:53:22.769 134138 DEBUG oslo_concurrency.lockutils [req-5c26a6ff-6204-4a04-b748-ddfff61c5b9f b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:53:22.770 134138 INFO nova.scheduler.host_manager [req-5c26a6ff-6204-4a04-b748-ddfff61c5b9f b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-697-1 2026-04-23 01:53:22.770 134138 DEBUG nova.scheduler.manager [req-5c26a6ff-6204-4a04-b748-ddfff61c5b9f b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 29339MB disk: 26624MB io_ops: 2 instances: 2]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-23 01:53:22.770 134138 DEBUG nova.scheduler.manager 
[req-5c26a6ff-6204-4a04-b748-ddfff61c5b9f b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 29339MB disk: 26624MB io_ops: 2 instances: 2, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-23 01:53:22.770 134138 DEBUG nova.scheduler.utils [req-5c26a6ff-6204-4a04-b748-ddfff61c5b9f b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Attempting to claim resources in the placement API for instance 0076cb7f-4510-490c-bc62-d75ef0230241 claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-23 01:53:22.840 134138 DEBUG nova.scheduler.manager [req-5c26a6ff-6204-4a04-b748-ddfff61c5b9f b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] [instance: 0076cb7f-4510-490c-bc62-d75ef0230241] Selected host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 29339MB disk: 26624MB io_ops: 2 instances: 2 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-23 01:53:22.841 134138 DEBUG oslo_concurrency.lockutils [req-5c26a6ff-6204-4a04-b748-ddfff61c5b9f b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:53:22.841 134138 DEBUG oslo_concurrency.lockutils [req-5c26a6ff-6204-4a04-b748-ddfff61c5b9f 
b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:53:22.842 134138 DEBUG oslo_messaging._drivers.amqpdriver [req-5c26a6ff-6204-4a04-b748-ddfff61c5b9f b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: b0671672e5f0499c97fbf4c9652f8e12 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:53:22.844 134138 DEBUG oslo_messaging._drivers.amqpdriver [req-5c26a6ff-6204-4a04-b748-ddfff61c5b9f b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] sending reply msg_id: 11a515b6ddb245888a15cad4b7e8ffa9 reply queue: reply_196060a39108420a873d953cb9a1134e time elapsed: 0.16496981000000233s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-23 01:53:23.039 134138 DEBUG oslo_service.periodic_task [req-7fcba006-2a98-47b7-ae03-fb4b09ffdd5d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:53:23.043 134138 DEBUG oslo_concurrency.lockutils [req-f331a53b-20f1-4d89-bf1f-089923793932 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:53:23.044 134138 DEBUG oslo_concurrency.lockutils 
[req-f331a53b-20f1-4d89-bf1f-089923793932 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:53:23.069 134146 DEBUG oslo_service.periodic_task [req-4c382421-9c0b-4333-9c8c-141623220080 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:53:23.075 134146 DEBUG oslo_concurrency.lockutils [req-8133a47c-397d-47b7-93ce-28ae2e4102a3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:53:23.076 134146 DEBUG oslo_concurrency.lockutils [req-8133a47c-397d-47b7-93ce-28ae2e4102a3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:53:23.536 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:23.537 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:23.537 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:23.684 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:23.684 134138 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:23.685 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:24.464 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:24.464 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:24.464 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:24.969 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 720595d73da74703b9cfe408024e37ae __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:53:24.969 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 720595d73da74703b9cfe408024e37ae __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:53:24.969 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:24.969 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:24.970 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 720595d73da74703b9cfe408024e37ae poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:53:24.970 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message 
with unique_id: 720595d73da74703b9cfe408024e37ae poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:53:24.970 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 720595d73da74703b9cfe408024e37ae __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:53:24.970 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:24.970 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:24.970 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:24.970 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:24.970 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 720595d73da74703b9cfe408024e37ae __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:53:24.970 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:24.970 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 720595d73da74703b9cfe408024e37ae poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:53:24.970 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:24.970 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 
720595d73da74703b9cfe408024e37ae poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:53:24.971 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:24.971 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:24.971 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:24.971 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:24.973 134146 DEBUG oslo_concurrency.lockutils [req-2c2e43ae-4980-4c6f-8593-2327a27afcd8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:53:24.973 134145 DEBUG oslo_concurrency.lockutils [req-2c2e43ae-4980-4c6f-8593-2327a27afcd8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:53:24.973 134146 DEBUG oslo_concurrency.lockutils [req-2c2e43ae-4980-4c6f-8593-2327a27afcd8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by 
"nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:53:24.973 134145 DEBUG oslo_concurrency.lockutils [req-2c2e43ae-4980-4c6f-8593-2327a27afcd8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:53:24.973 134138 DEBUG oslo_concurrency.lockutils [req-2c2e43ae-4980-4c6f-8593-2327a27afcd8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:53:24.973 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:24.973 134138 DEBUG oslo_concurrency.lockutils [req-2c2e43ae-4980-4c6f-8593-2327a27afcd8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:53:24.973 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:24.974 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:24.973 
134140 DEBUG oslo_concurrency.lockutils [req-2c2e43ae-4980-4c6f-8593-2327a27afcd8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:53:24.974 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:24.974 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:24.974 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:24.974 134140 DEBUG oslo_concurrency.lockutils [req-2c2e43ae-4980-4c6f-8593-2327a27afcd8 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:53:24.974 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:24.974 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:24.974 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:24.974 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] 
AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:24.974 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:24.974 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:25.211 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: fa37e6e3bc344dc1b965c3aec18df036 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:53:25.211 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: fa37e6e3bc344dc1b965c3aec18df036 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:53:25.211 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: fa37e6e3bc344dc1b965c3aec18df036 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:53:25.211 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: fa37e6e3bc344dc1b965c3aec18df036 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:53:25.211 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:25.211 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:25.211 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: fa37e6e3bc344dc1b965c3aec18df036 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:53:25.211 134146 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:25.211 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: fa37e6e3bc344dc1b965c3aec18df036 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:53:25.211 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: fa37e6e3bc344dc1b965c3aec18df036 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:53:25.211 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:25.211 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:25.211 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: fa37e6e3bc344dc1b965c3aec18df036 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:53:25.211 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:25.211 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:25.211 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:25.212 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:25.212 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume 
poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:25.212 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:25.212 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:25.213 134138 DEBUG oslo_concurrency.lockutils [req-3125e7b4-9067-41bb-90cd-68287cf37979 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:53:25.213 134140 DEBUG oslo_concurrency.lockutils [req-3125e7b4-9067-41bb-90cd-68287cf37979 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:53:25.214 134138 DEBUG oslo_concurrency.lockutils [req-3125e7b4-9067-41bb-90cd-68287cf37979 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:53:25.214 134145 DEBUG oslo_concurrency.lockutils [req-3125e7b4-9067-41bb-90cd-68287cf37979 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by 
"nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:53:25.214 134146 DEBUG oslo_concurrency.lockutils [req-3125e7b4-9067-41bb-90cd-68287cf37979 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:53:25.214 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:25.214 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:25.214 134145 DEBUG oslo_concurrency.lockutils [req-3125e7b4-9067-41bb-90cd-68287cf37979 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:53:25.214 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:25.214 134146 DEBUG oslo_concurrency.lockutils [req-3125e7b4-9067-41bb-90cd-68287cf37979 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:53:25.214 
134140 DEBUG oslo_concurrency.lockutils [req-3125e7b4-9067-41bb-90cd-68287cf37979 b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:53:25.214 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:25.214 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:25.215 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:25.214 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:25.215 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:25.215 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:25.215 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:25.215 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:25.215 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:26.215 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:26.215 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:26.216 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:26.216 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:26.216 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:26.216 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:26.216 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:26.216 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:26.216 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:26.217 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:26.217 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:26.217 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:26.245 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 30bddacd86cd4954bcc1776ee8388e45 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:53:26.245 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 30bddacd86cd4954bcc1776ee8388e45 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:53:26.245 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 30bddacd86cd4954bcc1776ee8388e45 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:53:26.245 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:26.245 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:26.245 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 30bddacd86cd4954bcc1776ee8388e45 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:53:26.246 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 30bddacd86cd4954bcc1776ee8388e45 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:53:26.245 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:26.246 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is 
running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:26.246 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:26.246 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 30bddacd86cd4954bcc1776ee8388e45 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:53:26.246 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:26.246 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:26.246 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:26.246 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:26.248 134140 DEBUG oslo_concurrency.lockutils [req-5c26a6ff-6204-4a04-b748-ddfff61c5b9f b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:53:26.248 134138 DEBUG oslo_concurrency.lockutils [req-5c26a6ff-6204-4a04-b748-ddfff61c5b9f b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:53:26.248 134140 DEBUG oslo_concurrency.lockutils [req-5c26a6ff-6204-4a04-b748-ddfff61c5b9f b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:53:26.248 134138 DEBUG oslo_concurrency.lockutils [req-5c26a6ff-6204-4a04-b748-ddfff61c5b9f b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:53:26.249 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:53:26.249 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:53:26.249 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:26.249 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:26.249 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:26.249 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:26.250 134146 DEBUG oslo_concurrency.lockutils [req-5c26a6ff-6204-4a04-b748-ddfff61c5b9f b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:53:26.250 134146 DEBUG oslo_concurrency.lockutils [req-5c26a6ff-6204-4a04-b748-ddfff61c5b9f b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:53:26.250 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:53:26.250 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:26.250 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:26.251 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 30bddacd86cd4954bcc1776ee8388e45 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:53:26.251 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:26.251 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 30bddacd86cd4954bcc1776ee8388e45 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:53:26.251 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:26.251 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:26.261 134145 DEBUG oslo_concurrency.lockutils [req-5c26a6ff-6204-4a04-b748-ddfff61c5b9f b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:53:26.261 134145 DEBUG oslo_concurrency.lockutils [req-5c26a6ff-6204-4a04-b748-ddfff61c5b9f b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:53:26.261 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:53:26.262 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:26.262 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:27.250 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:53:27.250 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:27.250 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:27.251 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:53:27.251 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:27.251 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:27.252 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:53:27.252 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:27.253 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:27.263 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:53:27.263 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:27.263 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:28.098 134140 DEBUG oslo_service.periodic_task [req-c698a2d9-d001-4240-b596-8bf8044966d5 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:53:28.102 134140 DEBUG oslo_concurrency.lockutils [req-12670d01-34a7-430e-aa4f-42f0c656b8bf - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:53:28.102 134140 DEBUG oslo_concurrency.lockutils [req-12670d01-34a7-430e-aa4f-42f0c656b8bf - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:53:29.252 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:53:29.252 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:29.253 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:29.254 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:53:29.254 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:29.254 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:29.254 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:53:29.255 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:29.255 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:29.264 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:53:29.265 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:29.265 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:33.255 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:53:33.255 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:33.255 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:33.256 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:53:33.256 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:33.256 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:33.258 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:53:33.259 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:33.259 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:33.267 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:53:33.267 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:33.268 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:39.065 134145 DEBUG oslo_service.periodic_task [req-d9094395-6802-45a9-b088-8700b91a8be9 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:53:39.069 134145 DEBUG oslo_concurrency.lockutils [req-77c1492a-fbb3-42a0-8907-fbbdd536d8da - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:53:39.069 134145 DEBUG oslo_concurrency.lockutils [req-77c1492a-fbb3-42a0-8907-fbbdd536d8da - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:53:41.262 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:53:41.262 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:41.262 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:53:41.262 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:41.262 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:41.263 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:41.263 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:53:41.263 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:41.263 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:41.270 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:53:41.270 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:41.270 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:53.048 134138 DEBUG oslo_service.periodic_task [req-f331a53b-20f1-4d89-bf1f-089923793932 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:53:53.053 134138 DEBUG oslo_concurrency.lockutils [req-3344fe2a-d2bd-4aa8-87e1-3874d407db9b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:53:53.053 134138 DEBUG oslo_concurrency.lockutils [req-3344fe2a-d2bd-4aa8-87e1-3874d407db9b - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:53:53.084 134146 DEBUG oslo_service.periodic_task [req-8133a47c-397d-47b7-93ce-28ae2e4102a3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:53:53.091 134146 DEBUG oslo_concurrency.lockutils [req-13e40c13-7ce5-488c-a748-eb8896d0e0cb - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:53:53.092 134146 DEBUG oslo_concurrency.lockutils [req-13e40c13-7ce5-488c-a748-eb8896d0e0cb - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:53:57.264 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:53:57.265 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:57.265 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:57.265 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:53:57.266 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:57.266 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:57.269 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:53:57.270 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:57.270 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:57.272 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:53:57.272 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:57.272 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:57.861 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 39c31dc6a6fa4218b7ff78c702d9546d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:53:57.861 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 39c31dc6a6fa4218b7ff78c702d9546d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:53:57.861 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 39c31dc6a6fa4218b7ff78c702d9546d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:53:57.862 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 39c31dc6a6fa4218b7ff78c702d9546d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:53:57.862 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:57.862 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:57.862 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:57.862 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 39c31dc6a6fa4218b7ff78c702d9546d poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:53:57.862 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 39c31dc6a6fa4218b7ff78c702d9546d poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:53:57.862 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 39c31dc6a6fa4218b7ff78c702d9546d poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:53:57.862 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:57.862 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 39c31dc6a6fa4218b7ff78c702d9546d poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:53:57.862 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:57.862 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:57.862 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:57.862 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:57.862 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:57.862 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:57.862 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:57.862 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:57.863 134138 DEBUG oslo_concurrency.lockutils [req-562e5284-ab43-4501-93e3-90db6da0d43c b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:53:57.863 134145 DEBUG oslo_concurrency.lockutils [req-562e5284-ab43-4501-93e3-90db6da0d43c b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:53:57.863 134140 DEBUG oslo_concurrency.lockutils [req-562e5284-ab43-4501-93e3-90db6da0d43c b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:53:57.863 134138 DEBUG oslo_concurrency.lockutils [req-562e5284-ab43-4501-93e3-90db6da0d43c b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:53:57.863 134140 DEBUG oslo_concurrency.lockutils [req-562e5284-ab43-4501-93e3-90db6da0d43c b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:53:57.863 134145 DEBUG oslo_concurrency.lockutils [req-562e5284-ab43-4501-93e3-90db6da0d43c b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:53:57.863 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:53:57.864 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:57.864 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:57.864 134146 DEBUG oslo_concurrency.lockutils [req-562e5284-ab43-4501-93e3-90db6da0d43c b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:53:57.864 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:53:57.864 134146 DEBUG oslo_concurrency.lockutils [req-562e5284-ab43-4501-93e3-90db6da0d43c b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:53:57.865 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:57.865 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:53:57.865 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:57.865 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:57.865 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:57.865 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:53:57.865 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:57.866 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:58.090 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 4a140e202d994536b93ac1a9b22310bd __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:53:58.091 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:58.091 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 4a140e202d994536b93ac1a9b22310bd poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:53:58.091 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:58.091 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:58.092 134146 DEBUG oslo_concurrency.lockutils [req-2ff71ab2-877f-4fd3-a440-e2a08321d4ab b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:53:58.092 134146 DEBUG oslo_concurrency.lockutils [req-2ff71ab2-877f-4fd3-a440-e2a08321d4ab b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:53:58.093 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 4a140e202d994536b93ac1a9b22310bd __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:53:58.093 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:53:58.093 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:58.093 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:58.094 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:58.095 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 4a140e202d994536b93ac1a9b22310bd poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:53:58.095 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:58.095 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:58.095 134140 DEBUG oslo_concurrency.lockutils [req-2ff71ab2-877f-4fd3-a440-e2a08321d4ab b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:53:58.096 134140 DEBUG oslo_concurrency.lockutils [req-2ff71ab2-877f-4fd3-a440-e2a08321d4ab b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:53:58.096 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 4a140e202d994536b93ac1a9b22310bd __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:53:58.096 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:58.096 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 4a140e202d994536b93ac1a9b22310bd poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:53:58.097 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:53:58.097 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:53:58.098 134145 DEBUG oslo_concurrency.lockutils [req-2ff71ab2-877f-4fd3-a440-e2a08321d4ab b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23
01:53:58.098 134145 DEBUG oslo_concurrency.lockutils [req-2ff71ab2-877f-4fd3-a440-e2a08321d4ab b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:53:58.098 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:58.098 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:58.099 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:58.100 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 4a140e202d994536b93ac1a9b22310bd __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:53:58.100 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:58.101 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 4a140e202d994536b93ac1a9b22310bd poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:53:58.101 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:58.101 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:58.102 134138 DEBUG 
oslo_concurrency.lockutils [req-2ff71ab2-877f-4fd3-a440-e2a08321d4ab b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:53:58.102 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:58.102 134138 DEBUG oslo_concurrency.lockutils [req-2ff71ab2-877f-4fd3-a440-e2a08321d4ab b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:53:58.102 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:58.102 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:58.103 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:58.104 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:58.104 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:58.108 134140 DEBUG oslo_service.periodic_task 
[req-12670d01-34a7-430e-aa4f-42f0c656b8bf - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:53:58.118 134140 DEBUG oslo_concurrency.lockutils [req-daae9fc3-5c57-4e6a-b6fe-c950554926a8 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:53:58.118 134140 DEBUG oslo_concurrency.lockutils [req-daae9fc3-5c57-4e6a-b6fe-c950554926a8 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:53:59.095 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:59.095 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:59.095 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:59.102 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:59.102 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:59.102 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:59.104 134145 
DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:59.104 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:59.105 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:53:59.105 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:53:59.105 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:53:59.105 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:01.097 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:54:01.098 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:01.098 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:01.105 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:54:01.105 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:01.105 134140 
DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:01.105 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:54:01.106 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:01.106 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:01.109 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:54:01.110 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:01.110 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:02.364 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: b207ffb8dbd244f6b29df6d6ab7d0e0c __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:54:02.364 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: b207ffb8dbd244f6b29df6d6ab7d0e0c __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:54:02.364 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: b207ffb8dbd244f6b29df6d6ab7d0e0c __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:54:02.364 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] 
Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:02.364 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:02.366 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: b207ffb8dbd244f6b29df6d6ab7d0e0c poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:54:02.366 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:02.366 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:02.366 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: b207ffb8dbd244f6b29df6d6ab7d0e0c poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:54:02.366 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:02.366 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:02.366 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:02.365 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: b207ffb8dbd244f6b29df6d6ab7d0e0c __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:54:02.367 134138 DEBUG oslo_concurrency.lockutils [req-e4ff44ad-e780-4a44-bb3c-bee2bf15ea3d b2a2cb3a60944a0c88b4121b13faccce 
f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:54:02.367 134145 DEBUG oslo_concurrency.lockutils [req-e4ff44ad-e780-4a44-bb3c-bee2bf15ea3d b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:54:02.367 134138 DEBUG oslo_concurrency.lockutils [req-e4ff44ad-e780-4a44-bb3c-bee2bf15ea3d b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:54:02.367 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:02.367 134145 DEBUG oslo_concurrency.lockutils [req-e4ff44ad-e780-4a44-bb3c-bee2bf15ea3d b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:54:02.367 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: b207ffb8dbd244f6b29df6d6ab7d0e0c poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:54:02.368 134140 
DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: b207ffb8dbd244f6b29df6d6ab7d0e0c poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:54:02.368 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:02.368 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:02.368 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:02.368 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:02.369 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:54:02.369 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:54:02.369 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:02.369 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:02.369 134146 DEBUG oslo_concurrency.lockutils [req-e4ff44ad-e780-4a44-bb3c-bee2bf15ea3d b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 
0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:54:02.369 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:02.369 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:02.369 134146 DEBUG oslo_concurrency.lockutils [req-e4ff44ad-e780-4a44-bb3c-bee2bf15ea3d b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:54:02.370 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:54:02.370 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:02.370 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:02.369 134140 DEBUG oslo_concurrency.lockutils [req-e4ff44ad-e780-4a44-bb3c-bee2bf15ea3d b2a2cb3a60944a0c88b4121b13faccce f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:54:02.370 134140 DEBUG oslo_concurrency.lockutils [req-e4ff44ad-e780-4a44-bb3c-bee2bf15ea3d b2a2cb3a60944a0c88b4121b13faccce 
f33673cce13045b58a0d813caec251fb - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:54:02.372 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:54:02.372 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:02.372 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:03.371 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:54:03.371 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:03.371 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:03.371 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:54:03.371 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:03.371 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:03.371 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] 
AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:54:03.372 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:03.372 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:03.373 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:54:03.374 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:03.374 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:05.372 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:54:05.372 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:05.372 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:05.373 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:54:05.373 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:05.373 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] 
AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:05.373 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:54:05.373 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:05.374 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:05.375 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:54:05.375 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:05.375 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:09.373 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:54:09.378 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:54:09.423 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:09.423 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:09.378 134146 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:54:09.379 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:54:09.424 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:09.424 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:09.424 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:09.424 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:09.425 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:09.425 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:10.052 134145 DEBUG oslo_service.periodic_task [req-77c1492a-fbb3-42a0-8907-fbbdd536d8da - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:54:10.056 134145 DEBUG oslo_concurrency.lockutils [req-a3fe89f5-1ff1-4317-8c0b-c3aceacc43b3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:54:10.057 134145 DEBUG oslo_concurrency.lockutils [req-a3fe89f5-1ff1-4317-8c0b-c3aceacc43b3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:54:17.426 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:54:17.426 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:17.426 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:17.426 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:54:17.426 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:17.426 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:17.426 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:54:17.427 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:17.427 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:17.427 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:54:17.427 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:17.428 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:23.059 134138 DEBUG oslo_service.periodic_task [req-3344fe2a-d2bd-4aa8-87e1-3874d407db9b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:54:23.063 134138 DEBUG oslo_concurrency.lockutils [req-91519398-d2a1-480d-a4d9-8fbd1a1bd3ea - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:54:23.063 134138 DEBUG oslo_concurrency.lockutils [req-91519398-d2a1-480d-a4d9-8fbd1a1bd3ea - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:54:23.098 134146 DEBUG oslo_service.periodic_task [req-13e40c13-7ce5-488c-a748-eb8896d0e0cb - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:54:23.102 134146 DEBUG oslo_concurrency.lockutils [req-d8592490-f948-4d94-b74c-4bcc4e6db4f1 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:54:23.102 134146 DEBUG oslo_concurrency.lockutils [req-d8592490-f948-4d94-b74c-4bcc4e6db4f1 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:54:28.125 134140 DEBUG oslo_service.periodic_task [req-daae9fc3-5c57-4e6a-b6fe-c950554926a8 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:54:28.129 134140 DEBUG oslo_concurrency.lockutils [req-4981323b-c428-453a-8831-f8bb06f1b11c - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:54:28.129 134140 DEBUG oslo_concurrency.lockutils [req-4981323b-c428-453a-8831-f8bb06f1b11c - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:54:32.630 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: 6b061bb969a94663b51f0d31f3929dfc reply to reply_196060a39108420a873d953cb9a1134e __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-23 01:54:32.631 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:32.631 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 
869ed614ef054c21b50255ff2247c6c9 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:54:32.631 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:32.631 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:32.633 134145 DEBUG nova.scheduler.manager [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Starting to schedule for instances: ['7bc9fa38-04ef-45aa-9fd0-b595f23c6eeb'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-23 01:54:32.633 134145 DEBUG nova.scheduler.request_filter [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-23 01:54:32.634 134145 DEBUG nova.scheduler.request_filter [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:54:32.634 134145 DEBUG nova.scheduler.request_filter [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'accelerators_filter' took 0.0 seconds wrapper 
/usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:54:32.634 134145 DEBUG nova.scheduler.request_filter [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-23 01:54:32.637 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:54:32.638 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:32.638 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:32.683 134145 DEBUG oslo_messaging._drivers.amqpdriver [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: f03d5f9b5f654230afa1f78e18eaaaa9 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:54:32.685 134145 DEBUG oslo_concurrency.lockutils [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:54:32.685 134145 DEBUG oslo_concurrency.lockutils [req-401559ea-cea0-4c62-8c59-e29932853025 
ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:54:32.694 134145 DEBUG oslo_concurrency.lockutils [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:54:32.694 134145 DEBUG nova.scheduler.host_manager [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["fpu", "apic", "pku", "pge", "pse", "fsgsbase", "rtm", "mmx", "avx512bw", "cx16", "avx2", "cmov", "xsavec", "clwb", "smap", "clflushopt", "avx512-vpopcntdq", "pat", "lm", "pclmuldq", "msr", "sse4.2", "ssse3", "arat", "adx", "sse2", "fma", "aes", "rdtscp", "avx512vl", "xsaveopt", "3dnowprefetch", "vme", "hle", "sse4.1", "syscall", "vpclmulqdq", "cx8", "lahf_lm", "clflush", "avx512cd", "tsc", "avx512dq", "gfni", "pse36", "wbnoinvd", "sep", "nx", "avx512vbmi", "popcnt", "spec-ctrl", "vaes", "pae", "fxsr", "ssbd", "la57", "avx512f", "hypervisor", "mca", "abm", "avx512vbmi2", "mtrr", "movbe", "pni", "rdseed", "xgetbv1", "pcid", "invpcid", "erms", "ht", "avx512bitalg", 
"mce", "umip", "de", "rdrand", "pdpe1gb", "smep", "xsave", "bmi1", "bmi2", "avx", "avx512vnni", "x2apic", "tsc-deadline", "f16c", "sse"]}',created_at=2026-04-23T00:42:02Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=23,free_disk_gb=77,free_ram_mb=31387,host='cn-jenkins-deploy-platform-juju-os-697-1',host_ip=10.0.0.129,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-697-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=0,mapped=1,memory_mb=31899,memory_mb_used=512,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035207, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["used", "reserved", "total", "size_kb"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, 
"nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["id", "cpu_usage", "memory_usage", "pinned_cpus", "network_metadata", "siblings", "memory", "socket", "pcpuset", "mempages", "cpuset"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=0,service_id=None,stats={failed_builds='0',io_workload='0',num_instances='0',num_os_type_None='0',num_proj_f33673cce13045b58a0d813caec251fb='0',num_task_None='0',num_vm_active='0'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-23T01:54:02Z,uuid=aa32474e-00f6-4c5b-a2ec-d12a3f31dd05,vcpus=8,vcpus_used=0) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-23 01:54:32.695 134145 DEBUG nova.scheduler.host_manager [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-23 01:54:32.695 134145 DEBUG nova.scheduler.host_manager [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with service dict: {'id': 9, 'uuid': '979fb67d-7796-4f44-89f1-b2234e3e1381', 'host': 'cn-jenkins-deploy-platform-juju-os-697-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 433, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 23, 1, 54, 32, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 23, 0, 42, 2, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 23, 1, 54, 
32, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-23 01:54:32.696 134145 DEBUG nova.scheduler.host_manager [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-23 01:54:32.696 134145 DEBUG oslo_concurrency.lockutils [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:54:32.696 134145 DEBUG nova.filters [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Starting with 1 host(s) get_filtered_objects /usr/lib/python3/dist-packages/nova/filters.py:70 2026-04-23 01:54:32.697 134145 DEBUG nova.filters [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Filter AvailabilityZoneFilter returned 1 host(s) get_filtered_objects /usr/lib/python3/dist-packages/nova/filters.py:102 2026-04-23 01:54:32.697 134145 DEBUG nova.filters [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Filter ComputeFilter returned 1 host(s) get_filtered_objects 
/usr/lib/python3/dist-packages/nova/filters.py:102 2026-04-23 01:54:32.697 134145 DEBUG nova.filters [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Filter ComputeCapabilitiesFilter returned 1 host(s) get_filtered_objects /usr/lib/python3/dist-packages/nova/filters.py:102 2026-04-23 01:54:32.697 134145 DEBUG nova.filters [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Filter ImagePropertiesFilter returned 1 host(s) get_filtered_objects /usr/lib/python3/dist-packages/nova/filters.py:102 2026-04-23 01:54:32.698 134145 DEBUG nova.filters [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Filter ServerGroupAntiAffinityFilter returned 1 host(s) get_filtered_objects /usr/lib/python3/dist-packages/nova/filters.py:102 2026-04-23 01:54:32.698 134145 DEBUG nova.filters [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Filter ServerGroupAffinityFilter returned 1 host(s) get_filtered_objects /usr/lib/python3/dist-packages/nova/filters.py:102 2026-04-23 01:54:32.698 134145 DEBUG nova.filters [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Filter DifferentHostFilter returned 1 host(s) get_filtered_objects /usr/lib/python3/dist-packages/nova/filters.py:102 2026-04-23 01:54:32.698 134145 DEBUG nova.filters [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 
463c1a7f457542babd061a9f36cbf244] Filter SameHostFilter returned 1 host(s) get_filtered_objects /usr/lib/python3/dist-packages/nova/filters.py:102 2026-04-23 01:54:32.698 134145 DEBUG nova.scheduler.manager [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Filtered [(cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 23552MB io_ops: 0 instances: 0] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-23 01:54:32.699 134145 DEBUG nova.scheduler.manager [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 23552MB io_ops: 0 instances: 0, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-23 01:54:32.699 134145 DEBUG nova.scheduler.utils [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Attempting to claim resources in the placement API for instance 7bc9fa38-04ef-45aa-9fd0-b595f23c6eeb claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-23 01:54:32.794 134145 DEBUG nova.scheduler.manager [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] [instance: 7bc9fa38-04ef-45aa-9fd0-b595f23c6eeb] Selected host: (cn-jenkins-deploy-platform-juju-os-697-1, cn-jenkins-deploy-platform-juju-os-697-1) ram: 31387MB disk: 23552MB io_ops: 0 instances: 0 _consume_selected_host 
/usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-23 01:54:32.795 134145 DEBUG oslo_concurrency.lockutils [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:54:32.795 134145 DEBUG oslo_concurrency.lockutils [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "('cn-jenkins-deploy-platform-juju-os-697-1', 'cn-jenkins-deploy-platform-juju-os-697-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:54:32.796 134145 DEBUG oslo_messaging._drivers.amqpdriver [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] CAST unique_id: 27a0bde33bfe4e91ab89253894d32ae8 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-23 01:54:32.798 134145 DEBUG oslo_messaging._drivers.amqpdriver [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] sending reply msg_id: 6b061bb969a94663b51f0d31f3929dfc reply queue: reply_196060a39108420a873d953cb9a1134e time elapsed: 0.16761965200021223s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-23 
01:54:33.428 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:54:33.429 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:33.429 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:33.429 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:54:33.429 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:33.429 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:33.429 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:54:33.430 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:33.430 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:33.639 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:54:33.639 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 
01:54:33.639 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:35.642 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:54:35.642 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:35.642 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:36.267 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c93dcf6554f342c794d84c93ae830d93 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:54:36.267 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c93dcf6554f342c794d84c93ae830d93 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:54:36.267 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c93dcf6554f342c794d84c93ae830d93 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:54:36.267 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:36.267 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:36.267 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c93dcf6554f342c794d84c93ae830d93 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:54:36.267 
134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c93dcf6554f342c794d84c93ae830d93 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:54:36.267 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:36.267 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:36.267 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:36.267 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c93dcf6554f342c794d84c93ae830d93 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:54:36.268 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:36.268 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:36.268 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:36.268 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:36.270 134145 DEBUG oslo_concurrency.lockutils [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by 
"nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:54:36.270 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c93dcf6554f342c794d84c93ae830d93 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:54:36.270 134145 DEBUG oslo_concurrency.lockutils [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:54:36.270 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:54:36.270 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:36.270 134146 DEBUG oslo_concurrency.lockutils [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:54:36.270 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:36.270 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:36.270 134146 DEBUG 
oslo_concurrency.lockutils [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:54:36.271 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:54:36.271 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:36.271 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:36.270 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c93dcf6554f342c794d84c93ae830d93 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:54:36.271 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:36.272 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:36.271 134138 DEBUG oslo_concurrency.lockutils [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:54:36.272 134138 DEBUG 
oslo_concurrency.lockutils [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:54:36.274 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:54:36.275 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:54:36.275 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:54:36.274 134140 DEBUG oslo_concurrency.lockutils [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:54:36.275 134140 DEBUG oslo_concurrency.lockutils [req-401559ea-cea0-4c62-8c59-e29932853025 ec9b62c1fcfa43348453cd1dbcd01b1b d5c6a3af7cce4f6984150f25d0a07c5b - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:54:36.276 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:54:36.276 
134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:36.276 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:37.271 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:37.272 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:37.272 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:37.273 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:37.273 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:37.273 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:37.276 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:37.276 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:37.277 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:37.278 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:37.278 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:37.278 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:39.274 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:39.275 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:39.275 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:39.275 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:39.275 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:39.276 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:39.278 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:39.278 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:39.278 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:39.281 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:39.281 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:39.281 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:40.061 134145 DEBUG oslo_service.periodic_task [req-a3fe89f5-1ff1-4317-8c0b-c3aceacc43b3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:54:40.065 134145 DEBUG oslo_concurrency.lockutils [req-27d1872d-638c-4c8b-89e8-ffbcd9112174 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:54:40.066 134145 DEBUG oslo_concurrency.lockutils [req-27d1872d-638c-4c8b-89e8-ffbcd9112174 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:54:41.943 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 6ba869337ebc4c7abfdec1662c3bf374 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:54:41.943 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 6ba869337ebc4c7abfdec1662c3bf374 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:54:41.943 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 6ba869337ebc4c7abfdec1662c3bf374 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:54:41.943 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:41.943 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:41.943 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:41.944 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 6ba869337ebc4c7abfdec1662c3bf374 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:54:41.944 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 6ba869337ebc4c7abfdec1662c3bf374 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:54:41.944 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 6ba869337ebc4c7abfdec1662c3bf374 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:54:41.944 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:41.944 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:41.944 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:41.944 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:41.944 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:41.944 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:41.944 134146 DEBUG oslo_concurrency.lockutils [req-f4705d89-2887-464f-904c-4449e3e634cf - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:54:41.944 134138 DEBUG oslo_concurrency.lockutils [req-f4705d89-2887-464f-904c-4449e3e634cf - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:54:41.944 134140 DEBUG oslo_concurrency.lockutils [req-f4705d89-2887-464f-904c-4449e3e634cf - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:54:41.945 134146 DEBUG nova.scheduler.host_manager [req-f4705d89-2887-464f-904c-4449e3e634cf - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:54:41.945 134138 DEBUG nova.scheduler.host_manager [req-f4705d89-2887-464f-904c-4449e3e634cf - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:54:41.945 134140 DEBUG nova.scheduler.host_manager [req-f4705d89-2887-464f-904c-4449e3e634cf - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:54:41.945 134146 DEBUG oslo_concurrency.lockutils [req-f4705d89-2887-464f-904c-4449e3e634cf - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:54:41.945 134138 DEBUG oslo_concurrency.lockutils [req-f4705d89-2887-464f-904c-4449e3e634cf - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:54:41.945 134140 DEBUG oslo_concurrency.lockutils [req-f4705d89-2887-464f-904c-4449e3e634cf - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:54:41.945 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:41.945 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:41.945 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:41.945 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:41.945 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:41.945 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:41.945 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:41.945 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:41.946 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:41.946 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 6ba869337ebc4c7abfdec1662c3bf374 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:54:41.947 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:41.947 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 6ba869337ebc4c7abfdec1662c3bf374 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:54:41.947 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:41.947 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:41.948 134145 DEBUG oslo_concurrency.lockutils [req-f4705d89-2887-464f-904c-4449e3e634cf - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:54:41.948 134145 DEBUG nova.scheduler.host_manager [req-f4705d89-2887-464f-904c-4449e3e634cf - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-23 01:54:41.948 134145 DEBUG oslo_concurrency.lockutils [req-f4705d89-2887-464f-904c-4449e3e634cf - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:54:41.950 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:41.950 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:41.950 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:42.947 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:42.947 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:42.947 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:42.947 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:42.947 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:42.947 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:42.947 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:42.947 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:42.947 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:42.951 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:42.951 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:42.951 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:44.949 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:44.949 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:44.949 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:44.950 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:44.950 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:44.950 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:44.950 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:44.951 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:44.951 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:44.954 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:44.954 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:44.955 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:48.951 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:48.952 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:48.952 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:48.952 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:48.952 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:48.952 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:48.952 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:48.953 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:48.953 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:48.958 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:48.958 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:48.959 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:53.070 134138 DEBUG oslo_service.periodic_task [req-91519398-d2a1-480d-a4d9-8fbd1a1bd3ea - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:54:53.074 134138 DEBUG oslo_concurrency.lockutils [req-28e49468-79e1-4373-b788-b1edd5fc0f2a - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:54:53.074 134138 DEBUG oslo_concurrency.lockutils [req-28e49468-79e1-4373-b788-b1edd5fc0f2a - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:54:53.109 134146 DEBUG oslo_service.periodic_task [req-d8592490-f948-4d94-b74c-4bcc4e6db4f1 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:54:53.113 134146 DEBUG oslo_concurrency.lockutils [req-2f8d3fa8-6df7-440b-bd94-a4e7ee659796 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:54:53.114 134146 DEBUG oslo_concurrency.lockutils [req-2f8d3fa8-6df7-440b-bd94-a4e7ee659796 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:54:53.627 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: f8eb314f6a6741c7aa1d92978b71568f __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:54:53.627 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: f8eb314f6a6741c7aa1d92978b71568f __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:54:53.627 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: f8eb314f6a6741c7aa1d92978b71568f __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:54:53.627 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:53.627 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:53.627 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:53.627 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f8eb314f6a6741c7aa1d92978b71568f poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:54:53.627 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f8eb314f6a6741c7aa1d92978b71568f poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:54:53.627 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f8eb314f6a6741c7aa1d92978b71568f poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:54:53.628 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:53.628 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:53.628 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:53.628 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:53.628 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:53.628 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:53.628 134140 DEBUG oslo_concurrency.lockutils [req-6d8fb232-af31-4d44-aa69-ffaa5fbaa20c c5e2064c098b4632b1506bc2856363b0 6a2607fb1e0c4e3c84d0f6eee5ff75a6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:54:53.628 134140 DEBUG oslo_concurrency.lockutils [req-6d8fb232-af31-4d44-aa69-ffaa5fbaa20c c5e2064c098b4632b1506bc2856363b0 6a2607fb1e0c4e3c84d0f6eee5ff75a6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:54:53.629 134145 DEBUG oslo_concurrency.lockutils [req-6d8fb232-af31-4d44-aa69-ffaa5fbaa20c c5e2064c098b4632b1506bc2856363b0 6a2607fb1e0c4e3c84d0f6eee5ff75a6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:54:53.629 134145 DEBUG oslo_concurrency.lockutils [req-6d8fb232-af31-4d44-aa69-ffaa5fbaa20c c5e2064c098b4632b1506bc2856363b0 6a2607fb1e0c4e3c84d0f6eee5ff75a6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:54:53.629 134146 DEBUG oslo_concurrency.lockutils [req-6d8fb232-af31-4d44-aa69-ffaa5fbaa20c c5e2064c098b4632b1506bc2856363b0 6a2607fb1e0c4e3c84d0f6eee5ff75a6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:54:53.629 134146 DEBUG oslo_concurrency.lockutils [req-6d8fb232-af31-4d44-aa69-ffaa5fbaa20c c5e2064c098b4632b1506bc2856363b0 6a2607fb1e0c4e3c84d0f6eee5ff75a6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:54:53.629 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: f8eb314f6a6741c7aa1d92978b71568f __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-23 01:54:53.629 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:53.630 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:53.630 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:53.630 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:53.630 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:53.630 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:53.630 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:53.630 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:53.630 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f8eb314f6a6741c7aa1d92978b71568f poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-23 01:54:53.630 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:53.630 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:53.630 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:53.631 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:53.631 134138 DEBUG oslo_concurrency.lockutils [req-6d8fb232-af31-4d44-aa69-ffaa5fbaa20c c5e2064c098b4632b1506bc2856363b0 6a2607fb1e0c4e3c84d0f6eee5ff75a6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:54:53.631 134138 DEBUG oslo_concurrency.lockutils [req-6d8fb232-af31-4d44-aa69-ffaa5fbaa20c c5e2064c098b4632b1506bc2856363b0 6a2607fb1e0c4e3c84d0f6eee5ff75a6 - 463c1a7f457542babd061a9f36cbf244 463c1a7f457542babd061a9f36cbf244] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:54:53.633 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:53.633 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:53.633 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:54.631 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:54.631 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:54.631 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:54.631 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:54.632 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:54.632 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:54.632 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:54.632 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:54.632 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:54.634 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:54.635 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:54.635 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:56.633 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:56.633 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:56.633 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:56.633 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:56.633 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:56.634 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:56.634 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:56.634 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:56.634 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:56.637 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:54:56.637 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:54:56.638 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:54:58.135 134140 DEBUG oslo_service.periodic_task [req-4981323b-c428-453a-8831-f8bb06f1b11c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-23 01:54:58.139 134140 DEBUG oslo_concurrency.lockutils [req-daa23f49-987e-4397-ad3c-27335ce11967 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-23 01:54:58.139 134140 DEBUG oslo_concurrency.lockutils [req-daa23f49-987e-4397-ad3c-27335ce11967 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-23 01:55:00.636 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:55:00.637 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:55:00.637 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:55:00.637 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:55:00.637 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:55:00.637 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:55:00.638 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:55:00.638 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:55:00.639 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:55:00.642 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:55:00.642 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:55:00.642 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:55:08.640 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:55:08.640 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-23 01:55:08.641 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-23 01:55:08.641 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-23 01:55:08.641 134146 DEBUG
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:55:08.641 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:55:08.642 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:55:08.643 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:55:08.643 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:55:08.646 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:55:08.646 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:55:08.647 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:55:10.072 134145 DEBUG oslo_service.periodic_task [req-27d1872d-638c-4c8b-89e8-ffbcd9112174 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:55:10.077 134145 DEBUG oslo_concurrency.lockutils [req-bc9eb858-bdf1-4b63-b7b4-021993aa7ec8 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:55:10.077 134145 DEBUG oslo_concurrency.lockutils [req-bc9eb858-bdf1-4b63-b7b4-021993aa7ec8 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:55:23.120 134146 DEBUG oslo_service.periodic_task [req-2f8d3fa8-6df7-440b-bd94-a4e7ee659796 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:55:23.124 134146 DEBUG oslo_concurrency.lockutils [req-29c15b53-62ff-443e-8649-7584a1765b1d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:55:23.125 134146 DEBUG oslo_concurrency.lockutils [req-29c15b53-62ff-443e-8649-7584a1765b1d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:55:24.019 134138 DEBUG oslo_service.periodic_task [req-28e49468-79e1-4373-b788-b1edd5fc0f2a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:55:24.023 134138 DEBUG oslo_concurrency.lockutils [req-0eca9290-9c2a-4514-af15-63d4948844db - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:55:24.024 134138 DEBUG 
oslo_concurrency.lockutils [req-0eca9290-9c2a-4514-af15-63d4948844db - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:55:24.642 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:55:24.642 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:55:24.642 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:55:24.643 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:55:24.643 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:55:24.643 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:55:24.644 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:55:24.645 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:55:24.645 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:55:24.648 134138 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:55:24.648 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:55:24.649 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:55:28.145 134140 DEBUG oslo_service.periodic_task [req-daa23f49-987e-4397-ad3c-27335ce11967 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:55:28.150 134140 DEBUG oslo_concurrency.lockutils [req-936ece5d-a118-44d4-80f9-dd7721f02c31 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:55:28.150 134140 DEBUG oslo_concurrency.lockutils [req-936ece5d-a118-44d4-80f9-dd7721f02c31 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:55:41.052 134145 DEBUG oslo_service.periodic_task [req-bc9eb858-bdf1-4b63-b7b4-021993aa7ec8 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:55:41.057 134145 DEBUG oslo_concurrency.lockutils [req-63a257cb-2f60-4389-960a-c13d9ed38e40 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:55:41.057 134145 DEBUG oslo_concurrency.lockutils [req-63a257cb-2f60-4389-960a-c13d9ed38e40 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:55:53.130 134146 DEBUG oslo_service.periodic_task [req-29c15b53-62ff-443e-8649-7584a1765b1d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:55:53.134 134146 DEBUG oslo_concurrency.lockutils [req-9cbb7785-0d12-440c-bdf4-eb69e3799578 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:55:53.135 134146 DEBUG oslo_concurrency.lockutils [req-9cbb7785-0d12-440c-bdf4-eb69e3799578 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:55:54.033 134138 DEBUG oslo_service.periodic_task [req-0eca9290-9c2a-4514-af15-63d4948844db - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:55:54.037 134138 DEBUG oslo_concurrency.lockutils [req-52b08a0c-af8a-44e6-88b3-d2b39027590d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:55:54.038 134138 DEBUG 
oslo_concurrency.lockutils [req-52b08a0c-af8a-44e6-88b3-d2b39027590d - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:55:58.157 134140 DEBUG oslo_service.periodic_task [req-936ece5d-a118-44d4-80f9-dd7721f02c31 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:55:58.161 134140 DEBUG oslo_concurrency.lockutils [req-9f101976-6aa0-483b-bb10-35682c7c151c - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:55:58.162 134140 DEBUG oslo_concurrency.lockutils [req-9f101976-6aa0-483b-bb10-35682c7c151c - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:56:03.139 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:56:03.139 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:56:03.139 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:56:03.171 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:56:03.172 134140 
DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:56:03.172 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:56:03.175 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:56:03.175 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:56:03.176 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:56:03.205 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:56:03.205 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:56:03.205 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:56:11.066 134145 DEBUG oslo_service.periodic_task [req-63a257cb-2f60-4389-960a-c13d9ed38e40 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:56:11.070 134145 DEBUG oslo_concurrency.lockutils [req-d29e323d-c5e4-44ef-80d5-4de91f76e9d3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:56:11.070 134145 DEBUG oslo_concurrency.lockutils [req-d29e323d-c5e4-44ef-80d5-4de91f76e9d3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:56:24.046 134138 DEBUG oslo_service.periodic_task [req-52b08a0c-af8a-44e6-88b3-d2b39027590d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:56:24.050 134138 DEBUG oslo_concurrency.lockutils [req-f23bf363-5110-4791-95aa-c5f23b58e642 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:56:24.050 134138 DEBUG oslo_concurrency.lockutils [req-f23bf363-5110-4791-95aa-c5f23b58e642 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:56:24.070 134146 DEBUG oslo_service.periodic_task [req-9cbb7785-0d12-440c-bdf4-eb69e3799578 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:56:24.074 134146 DEBUG oslo_concurrency.lockutils [req-fec4c78d-e3c8-4752-85e6-16ab4d4220c8 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:56:24.074 134146 DEBUG 
oslo_concurrency.lockutils [req-fec4c78d-e3c8-4752-85e6-16ab4d4220c8 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:56:28.168 134140 DEBUG oslo_service.periodic_task [req-9f101976-6aa0-483b-bb10-35682c7c151c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:56:28.172 134140 DEBUG oslo_concurrency.lockutils [req-23e0a0bd-b264-40b2-824b-54ce7946fb33 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:56:28.172 134140 DEBUG oslo_concurrency.lockutils [req-23e0a0bd-b264-40b2-824b-54ce7946fb33 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:56:41.079 134145 DEBUG oslo_service.periodic_task [req-d29e323d-c5e4-44ef-80d5-4de91f76e9d3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:56:41.083 134145 DEBUG oslo_concurrency.lockutils [req-7f00009d-8d92-4c96-aeb2-9d11be723121 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:56:41.083 134145 DEBUG oslo_concurrency.lockutils [req-7f00009d-8d92-4c96-aeb2-9d11be723121 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:56:42.906 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 8d325755dd804017b1c6ba5f0283741b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:56:42.906 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 8d325755dd804017b1c6ba5f0283741b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:56:42.906 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 8d325755dd804017b1c6ba5f0283741b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:56:42.906 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 8d325755dd804017b1c6ba5f0283741b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-23 01:56:42.906 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:56:42.906 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:56:42.906 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:56:42.906 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 8d325755dd804017b1c6ba5f0283741b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:56:42.906 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 8d325755dd804017b1c6ba5f0283741b poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:56:42.906 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 8d325755dd804017b1c6ba5f0283741b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:56:42.906 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:56:42.906 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:56:42.906 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:56:42.906 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:56:42.906 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:56:42.906 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:56:42.906 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 8d325755dd804017b1c6ba5f0283741b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-23 01:56:42.907 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:56:42.907 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:56:42.907 134146 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:56:42.907 134140 DEBUG oslo_concurrency.lockutils [req-e8ee3753-978f-4359-9a6c-c007a5f1a2cd - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:56:42.907 134138 DEBUG oslo_concurrency.lockutils [req-e8ee3753-978f-4359-9a6c-c007a5f1a2cd - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:56:42.907 134145 DEBUG oslo_concurrency.lockutils [req-e8ee3753-978f-4359-9a6c-c007a5f1a2cd - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:56:42.907 134140 DEBUG nova.scheduler.host_manager [req-e8ee3753-978f-4359-9a6c-c007a5f1a2cd - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:56:42.907 134138 DEBUG nova.scheduler.host_manager [req-e8ee3753-978f-4359-9a6c-c007a5f1a2cd - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:56:42.907 134145 DEBUG nova.scheduler.host_manager [req-e8ee3753-978f-4359-9a6c-c007a5f1a2cd - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:56:42.907 134140 DEBUG oslo_concurrency.lockutils [req-e8ee3753-978f-4359-9a6c-c007a5f1a2cd - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:56:42.908 134138 DEBUG oslo_concurrency.lockutils [req-e8ee3753-978f-4359-9a6c-c007a5f1a2cd - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:56:42.908 134145 DEBUG oslo_concurrency.lockutils [req-e8ee3753-978f-4359-9a6c-c007a5f1a2cd - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:56:42.908 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:56:42.908 134146 DEBUG oslo_concurrency.lockutils [req-e8ee3753-978f-4359-9a6c-c007a5f1a2cd - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:56:42.908 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:56:42.908 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:56:42.908 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:56:42.908 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:56:42.908 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:56:42.908 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:56:42.908 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:56:42.908 134146 DEBUG nova.scheduler.host_manager [req-e8ee3753-978f-4359-9a6c-c007a5f1a2cd - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-697-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-23 01:56:42.908 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:56:42.908 134146 DEBUG oslo_concurrency.lockutils [req-e8ee3753-978f-4359-9a6c-c007a5f1a2cd - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:56:42.909 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:56:42.909 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:56:42.909 134146 DEBUG oslo_messaging._drivers.amqpdriver 
[-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:56:43.909 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:56:43.909 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:56:43.909 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:56:43.909 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:56:43.910 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:56:43.910 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:56:43.910 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:56:43.910 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:56:43.910 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:56:43.911 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:56:43.911 134146 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:56:43.911 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:56:45.911 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:56:45.912 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:56:45.912 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:56:45.912 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:56:45.912 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:56:45.912 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:56:45.912 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:56:45.913 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:56:45.913 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:56:45.913 134146 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:56:45.913 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:56:45.914 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:56:49.913 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:56:49.913 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:56:49.914 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:56:49.914 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:56:49.914 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:56:49.914 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:56:49.916 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:56:49.916 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:56:49.916 134145 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:56:49.916 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:56:49.917 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:56:49.917 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:56:54.081 134146 DEBUG oslo_service.periodic_task [req-fec4c78d-e3c8-4752-85e6-16ab4d4220c8 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:56:54.085 134146 DEBUG oslo_concurrency.lockutils [req-508d0943-0b1f-4e0d-81be-2f99d2295473 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:56:54.085 134146 DEBUG oslo_concurrency.lockutils [req-508d0943-0b1f-4e0d-81be-2f99d2295473 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:56:55.020 134138 DEBUG oslo_service.periodic_task [req-f23bf363-5110-4791-95aa-c5f23b58e642 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:56:55.024 134138 DEBUG oslo_concurrency.lockutils 
[req-b9231f2f-2e79-49aa-afef-82015895cecd - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:56:55.024 134138 DEBUG oslo_concurrency.lockutils [req-b9231f2f-2e79-49aa-afef-82015895cecd - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:56:57.914 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:56:57.915 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:56:57.915 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:56:57.917 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:56:57.918 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:56:57.918 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:56:57.918 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:56:57.918 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:56:57.918 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:56:57.920 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:56:57.920 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:56:57.920 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:56:58.178 134140 DEBUG oslo_service.periodic_task [req-23e0a0bd-b264-40b2-824b-54ce7946fb33 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:56:58.182 134140 DEBUG oslo_concurrency.lockutils [req-1f72f964-d56e-440a-a027-bd292efafdfa - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:56:58.183 134140 DEBUG oslo_concurrency.lockutils [req-1f72f964-d56e-440a-a027-bd292efafdfa - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:57:12.052 134145 DEBUG oslo_service.periodic_task [req-7f00009d-8d92-4c96-aeb2-9d11be723121 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks 
/usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:57:12.057 134145 DEBUG oslo_concurrency.lockutils [req-ebc6db3a-982e-4319-a3d0-7e93817ba0ae - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:57:12.058 134145 DEBUG oslo_concurrency.lockutils [req-ebc6db3a-982e-4319-a3d0-7e93817ba0ae - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:57:13.916 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:57:13.916 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:57:13.917 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:57:13.920 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:57:13.920 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:57:13.920 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:57:13.920 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:57:13.920 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:57:13.920 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:57:13.922 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:57:13.923 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:57:13.923 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:57:24.092 134146 DEBUG oslo_service.periodic_task [req-508d0943-0b1f-4e0d-81be-2f99d2295473 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:57:24.097 134146 DEBUG oslo_concurrency.lockutils [req-7de46f87-217b-4e3c-b73e-7f4e19790ab3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:57:24.097 134146 DEBUG oslo_concurrency.lockutils [req-7de46f87-217b-4e3c-b73e-7f4e19790ab3 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:57:25.034 134138 DEBUG oslo_service.periodic_task 
[req-b9231f2f-2e79-49aa-afef-82015895cecd - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:57:25.038 134138 DEBUG oslo_concurrency.lockutils [req-a96fd69c-aa75-416d-be41-7a091a3098ad - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:57:25.039 134138 DEBUG oslo_concurrency.lockutils [req-a96fd69c-aa75-416d-be41-7a091a3098ad - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:57:28.191 134140 DEBUG oslo_service.periodic_task [req-1f72f964-d56e-440a-a027-bd292efafdfa - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:57:28.195 134140 DEBUG oslo_concurrency.lockutils [req-4b638319-aa47-4290-b41b-f904b601ff17 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:57:28.195 134140 DEBUG oslo_concurrency.lockutils [req-4b638319-aa47-4290-b41b-f904b601ff17 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:57:43.052 134145 DEBUG oslo_service.periodic_task [req-ebc6db3a-982e-4319-a3d0-7e93817ba0ae - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells 
run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-23 01:57:43.056 134145 DEBUG oslo_concurrency.lockutils [req-a8f6f127-adba-48aa-b251-f3c5b4e58eb4 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-23 01:57:43.056 134145 DEBUG oslo_concurrency.lockutils [req-a8f6f127-adba-48aa-b251-f3c5b4e58eb4 - - - - -] Lock "de8a22a1-b255-45ae-baf4-856041bc1c3f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-23 01:57:45.917 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:57:45.918 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:57:45.918 134138 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:57:45.924 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:57:45.924 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:57:45.924 134145 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:57:45.930 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:57:45.930 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:57:45.930 134140 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-23 01:57:45.932 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-23 01:57:45.933 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-23 01:57:45.933 134146 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357