---
base_model: ProTrekHub/Protein_Encoder_35M
library_name: peft
---

# Model Card for Model-RASH-ProTrek-35M

<slot name='description'>

## Task type
Protein-level Regression

**Dataset:** [SeprotHub/Dataset-RASH_HUMAN](https://huggingface.co/datasets/SeprotHub/Dataset-RASH_HUMAN)

## Model input type
AA Sequence
|
|
## LoRA config

- **r:** 8
- **lora_dropout:** 0.0
- **lora_alpha:** 16
- **target_modules:** ['key', 'output.dense', 'value', 'intermediate.dense', 'query']
- **modules_to_save:** ['classifier']
## Training config

- **optimizer:**
  - **class:** AdamW
  - **betas:** (0.9, 0.98)
  - **weight_decay:** 0.01
- **learning rate:** 0.0005
- **epoch:** 1
- **batch size:** 32
- **precision:** 16-mixed
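The optimizer settings correspond to `torch.optim.AdamW` as sketched below. The small `nn.Linear` module is only a placeholder for the actual LoRA-wrapped encoder (an assumption for self-containment):

```python
import torch
from torch import nn

# Placeholder standing in for the LoRA-wrapped ProTrek encoder.
model = nn.Linear(8, 1)

optimizer = torch.optim.AdamW(
    model.parameters(),
    lr=5e-4,              # learning rate: 0.0005
    betas=(0.9, 0.98),
    weight_decay=0.01,
)
```

Mixed 16-bit precision (`16-mixed`) is typically handled by the training framework (e.g. a Lightning `Trainer` flag) rather than by the optimizer itself.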
|
|
|
|