Advik-7/First_Model

Tags: Safetensors · bart
Branch: main · Repository size: 1.68 GB · 1 contributor · History: 3 commits
Latest commit: "Upload model and tokenizer files" (3e3e028, verified) by Advik-7, over 1 year ago
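Since the repository ships a BART checkpoint in safetensors format together with its tokenizer files, it should be loadable directly through the `transformers` library. A minimal sketch, assuming the checkpoint is a standard seq2seq BART model (the exact task head is not stated on the page; only the repo id is taken from it):

```python
# Minimal loading sketch -- assumes a standard seq2seq BART checkpoint.
# The seq2seq head is an assumption; only the repo id comes from the page.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

repo_id = "Advik-7/First_Model"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSeq2SeqLM.from_pretrained(repo_id)  # picks up model.safetensors

inputs = tokenizer("An example input sentence.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```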
| File | Size | Safety scan | Last commit message | Last updated |
|---|---|---|---|---|
| .gitattributes | 1.52 kB | Safe | initial commit | over 1 year ago |
| config.json | 1.68 kB | Safe | Upload folder using huggingface_hub | over 1 year ago |
| generation_config.json | 234 Bytes | Safe | Upload folder using huggingface_hub | over 1 year ago |
| merges.txt | 456 kB | Safe | Upload model and tokenizer files | over 1 year ago |
| model.safetensors | 558 MB | Safe | Upload folder using huggingface_hub | over 1 year ago |
| optimizer.pt | 1.12 GB | | Upload model and tokenizer files | over 1 year ago |
| rng_state.pth | 14.2 kB | pickle (7 imports detected) | Upload model and tokenizer files | over 1 year ago |
| scheduler.pt | 1.06 kB | Safe (pickle; no problematic imports detected) | Upload model and tokenizer files | over 1 year ago |
| special_tokens_map.json | 964 Bytes | Safe | Upload model and tokenizer files | over 1 year ago |
| tokenizer.json | 3.56 MB | Safe | Upload model and tokenizer files | over 1 year ago |
| tokenizer_config.json | 1.25 kB | Safe | Upload model and tokenizer files | over 1 year ago |
| trainer_state.json | 1.27 kB | Safe | Upload model and tokenizer files | over 1 year ago |
| training_args.bin | 5.37 kB | pickle (9 imports detected) | Upload model and tokenizer files | over 1 year ago |
| vocab.json | 798 kB | Safe | Upload model and tokenizer files | over 1 year ago |

Detected Pickle imports in rng_state.pth (7): "_codecs.encode", "numpy.core.multiarray._reconstruct", "numpy.ndarray", "torch._utils._rebuild_tensor_v2", "torch.ByteStorage", "collections.OrderedDict", "numpy.dtype"

Detected Pickle imports in training_args.bin (9): "torch.device", "transformers.trainer_utils.IntervalStrategy", "transformers.trainer_utils.HubStrategy", "transformers.trainer_pt_utils.AcceleratorConfig", "accelerate.state.PartialState", "accelerate.utils.dataclasses.DistributedType", "transformers.training_args.OptimizerNames", "transformers.training_args_seq2seq.Seq2SeqTrainingArguments", "transformers.trainer_utils.SchedulerType"
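The pickle warnings on rng_state.pth and training_args.bin reflect the fact that pickle files can execute arbitrary code when loaded. One common way to publish a pickle-free checkpoint is to re-save only the model and tokenizer with safetensors serialization and leave the training-state files (optimizer.pt, rng_state.pth, training_args.bin) out of the public repo. A minimal sketch, assuming the local training output lives in a hypothetical ./results/checkpoint directory:

```python
# Hedged sketch: re-export a pickle-free copy of the checkpoint.
# "./results/checkpoint" and "./export" are hypothetical local paths.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

src = "./results/checkpoint"   # local Trainer output (assumption)
dst = "./export"               # clean folder to push to the Hub

model = AutoModelForSeq2SeqLM.from_pretrained(src)
tokenizer = AutoTokenizer.from_pretrained(src)

# safe_serialization=True writes model.safetensors instead of a pickled .bin
model.save_pretrained(dst, safe_serialization=True)
tokenizer.save_pretrained(dst)

# The export folder now holds only JSON/text/safetensors files --
# no optimizer.pt, rng_state.pth, or training_args.bin pickles to flag.
```

The cleaned folder can then be uploaded with `model.push_to_hub(...)` or `huggingface_hub.upload_folder(...)`, consistent with the "Upload folder using huggingface_hub" commits already in the history.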