itay1itzhak and nielsr (HF Staff) committed
Commit e708eb8 · verified · 1 Parent(s): 0fb9688

Update license and add project page link (#1)

- Update license and add project page link (ef3904d5aa97092d8572615a359e43fa35c32b11)


Co-authored-by: Niels Rogge <nielsr@users.noreply.huggingface.co>

Files changed (1)
  1. README.md +11 -10
README.md CHANGED

@@ -1,20 +1,20 @@
 ---
-license: apache-2.0
-tags:
-- language-modeling
-- causal-lm
-- bias-analysis
-- cognitive-bias
+base_model:
+- allenai/OLMo-7B
 datasets:
 - allenai/tulu-v2-sft-mixture
 language:
 - en
+library_name: transformers
+license: mit
 metrics:
 - accuracy
-base_model:
-- allenai/OLMo-7B
 pipeline_tag: text-generation
-library_name: transformers
+tags:
+- language-modeling
+- causal-lm
+- bias-analysis
+- cognitive-bias
 ---
 
 # Model Card for OLMo-Tulu
@@ -29,9 +29,10 @@ This is one of 3 identical versions trained with different random seeds.
 
 - **Model type**: Causal decoder-based transformer
 - **Language(s)**: English
-- **License**: Apache 2.0
+- **License**: MIT
 - **Finetuned from**: `allenai/OLMo-7B`
 - **Paper**: https://arxiv.org/abs/2507.07186
+- **Project page**: https://itay1itzhak.github.io/planted-in-pretraining
 - **Repository**: https://github.com/itay1itzhak/planted-in-pretraining
 
 ## Uses
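After this change, the README front matter carries the new `license: mit`, the `base_model` entry, and the `library_name`. As a minimal sketch (stdlib only; the small line-based parser is illustrative, not a real YAML library), the updated metadata can be extracted and sanity-checked like so:

```python
# Front matter as it appears after the commit, reproduced from the diff above.
README = """\
---
base_model:
- allenai/OLMo-7B
datasets:
- allenai/tulu-v2-sft-mixture
language:
- en
library_name: transformers
license: mit
metrics:
- accuracy
pipeline_tag: text-generation
tags:
- language-modeling
- causal-lm
- bias-analysis
- cognitive-bias
---

# Model Card for OLMo-Tulu
"""

def front_matter(text: str) -> dict:
    """Parse the simple key/list front matter between the '---' fences.

    Handles only the flat scalar and one-level list shapes used in this
    model card; it is not a general YAML parser.
    """
    body = text.split("---\n")[1]  # content between the two fences
    meta, key = {}, None
    for line in body.splitlines():
        if line.startswith("- ") and key is not None:
            meta[key].append(line[2:])       # list item under the last key
        elif ":" in line:
            key, _, value = line.partition(":")
            meta[key] = value.strip() or []  # scalar, or empty list header

    return meta

meta = front_matter(README)
print(meta["license"])     # mit
print(meta["base_model"])  # ['allenai/OLMo-7B']
```

A quick check like this catches the kind of drift the commit fixes, where the stated license in the metadata block and the prose ("**License**: MIT") can fall out of sync.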