Hi Sarubi,
Sorry for the late reply.

For reference, here is the full signature of load_adapter:

load_adapter(adapter_name_or_path: str, config: Union[dict, str] = None, version: str = None, model_name: str = None, load_as: str = None, source: str = None, with_head: bool = True, custom_weights_loaders: Optional[List[transformers.adapters.loading.WeightsLoader]] = None, leave_out: Optional[List[int]] = None, id2label=None, set_active: bool = False, **kwargs) → str
The "leave_out" parameter lets you specify a list of layer indices at which the adapter should not be applied. You can find more details on the MBart page of the AdapterHub documentation.
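As a rough sketch of how this works: if the model has 12 encoder layers and you pass leave_out=[0, 1, 11], the adapter modules are injected only into the remaining layers. The snippet below computes that layer set; the commented-out lines show what the actual call might look like with adapter-transformers installed (the checkpoint name and adapter path there are placeholders, not tested values).

```python
# Illustrative sketch of the leave_out semantics (assumes a 12-layer model).
NUM_LAYERS = 12            # e.g. the mbart-large encoder depth (assumption)
leave_out = [0, 1, 11]     # layers where the adapter will NOT be injected

# Layers that still receive adapter modules:
active_layers = [i for i in range(NUM_LAYERS) if i not in leave_out]
print(active_layers)       # layers 2 through 10

# With adapter-transformers installed, the call would look roughly like:
# from transformers import MBartAdapterModel
# model = MBartAdapterModel.from_pretrained("facebook/mbart-large-cc25")
# name = model.load_adapter("path/to/adapter", leave_out=leave_out)
# model.set_active_adapters(name)
```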
If you need further assistance, feel free to reach out.