---
base_model: google/gemma-2-27b-it
library_name: peft
---
# Model Card for gemma-2-27b-it PEFT Adapter (UNLP 2025 Shared Task)
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
This is a PEFT adapter for google/gemma-2-27b-it, pretrained with a masked-language-modeling (unmasking) objective to serve as an LLM encoder for the UNLP 2025 shared task on span identification.
- **Developed by:** Anton Bazdyrev, Ivan Bashtovyi, Ivan Havlytskyi, Oleksandr Kharytonov, Artur Khodakhovskyi at National Technical University of Ukraine "Igor Sikorsky Kyiv Polytechnic Institute"
- **Finetuned from model:** google/gemma-2-27b-it, adapted with unmasking for MLM pretraining
### Model Sources
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/AntonBazdyrev/unlp2025_shared_task/blob/master/llm_encoder_pretrain/gemma2_27b_pretrain-mlm.ipynb
- **Paper:** TBD
- **Demo:** https://github.com/AntonBazdyrev/unlp2025_shared_task/tree/master/span_ident
### Framework versions
- PEFT 0.15.0
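## How to Get Started with the Model

Since this repository ships a PEFT adapter rather than full model weights, it is typically loaded on top of the base model with the `peft` library. Below is a minimal sketch, not the authors' exact setup: the adapter id is a placeholder, and access to the gated Gemma weights plus sufficient GPU memory are assumed.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "google/gemma-2-27b-it"
adapter_id = "path/to/this/adapter"  # placeholder: local path or Hub id of this adapter

# Load the tokenizer and the (gated) base model.
tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Attach the PEFT adapter weights on top of the base model.
model = PeftModel.from_pretrained(base_model, adapter_id)
model.eval()
```

For inference with the merged weights, `model.merge_and_unload()` can be called after loading to fold the adapter into the base model.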