
transformers 5.0.0 bug

#6
by tlpss - opened

Hi

When using transformers 5.0.0 I get the following error for the snippet on the model card:

model.safetensors: 100%|███████████████████████████████████████████████████████████████████████| 393M/393M [00:04<00:00, 87.6MB/s]
Loading weights: 100%|██████████████████████████| 175/175 [00:00<00:00, 7659.03it/s, Materializing param=radio_model.summary_idxs]
Traceback (most recent call last):
  File "/home/tlips/incar_ws/keypoint-imitation-learning/representation-extractor/representation_extractor/keypoints/vit_featurizer.py", line 230, in <module>
    model = AutoModel.from_pretrained(hf_repo, trust_remote_code=True)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/tlips/incar_ws/venv/lib/python3.12/site-packages/transformers/models/auto/auto_factory.py", line 366, in from_pretrained
    return model_class.from_pretrained(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/tlips/incar_ws/venv/lib/python3.12/site-packages/transformers/modeling_utils.py", line 250, in _wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/home/tlips/incar_ws/venv/lib/python3.12/site-packages/transformers/modeling_utils.py", line 4001, in from_pretrained
    model, missing_keys, unexpected_keys, mismatched_keys, offload_index, error_msgs = cls._load_pretrained_model(
                                                                                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/tlips/incar_ws/venv/lib/python3.12/site-packages/transformers/modeling_utils.py", line 4161, in _load_pretrained_model
    model.mark_tied_weights_as_initialized()
  File "/home/tlips/incar_ws/venv/lib/python3.12/site-packages/transformers/modeling_utils.py", line 4488, in mark_tied_weights_as_initialized
    for tied_param in self.all_tied_weights_keys.keys():
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/tlips/incar_ws/venv/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1964, in __getattr__
    raise AttributeError(
AttributeError: 'RADIOModel' object has no attribute 'all_tied_weights_keys'. Did you mean: '_tied_weights_keys'?
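For reference, the call that fails is essentially just the loading snippet from the model card; a minimal sketch (hf_repo here is only a placeholder for this repository's id):

from transformers import AutoModel

hf_repo = "<this-repo-id>"  # placeholder for the actual repository id
model = AutoModel.from_pretrained(hf_repo, trust_remote_code=True)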

I believe this is related to https://github.com/huggingface/transformers/issues/40822#issuecomment-3667746429

Any chance this will be fixed in the near future? And if not, are there any suggestions for working around it?

Update: see the solution below.

I managed to fix it locally by adding

super().post_init()

at the end of __init__ in the hf_model.py file.

This is based on https://github.com/huggingface/transformers/issues/40822#issuecomment-3669461058
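Roughly, the change looks like this (a sketch only: the surrounding class definition and its arguments are assumed, and only the super().post_init() line is the actual addition):

from transformers import PreTrainedModel

class RADIOModel(PreTrainedModel):
    def __init__(self, config):
        super().__init__(config)
        # ... existing RADIO setup (radio_model, summary_idxs, etc.) ...
        super().post_init()  # added: lets transformers 5.x set up its tied-weights bookkeeping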
