```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("marco-molinari/python-code-millenials-1b", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("marco-molinari/python-code-millenials-1b", trust_remote_code=True)
```
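Once the tokenizer and model are loaded, they can be used for generation like any causal language model. The snippet below is a minimal sketch; the prompt and the generation parameters (`max_new_tokens`, `temperature`) are illustrative choices, not settings recommended for this model.

```python
# Minimal generation sketch; prompt and parameters are illustrative
prompt = "def quicksort(arr):"
inputs = tokenizer(prompt, return_tensors="pt")

outputs = model.generate(
    **inputs,
    max_new_tokens=128,  # illustrative value
    do_sample=True,
    temperature=0.2,     # low temperatures usually suit code generation
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```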
# Model Card for python-code-millenials-1b
## Model Details

### Model Description
This model is a fine-tune of code-millenials-1b on the dataset listed under Training Data. The base model is good at coding and small enough to stay portable, but it was not trained specifically on Python; this fine-tune targets Python code.
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- Developed by: Marco Molinari
- Language(s) (NLP): Python
- Finetuned from model: code-millenials-1b
## Uses
Lightweight Python coding.
## Training Data
https://huggingface.co/datasets/ArtifactAI/arxiv_python_research_code
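The dataset can be inspected with the 🤗 `datasets` library. This is a minimal sketch; the split and column names are whatever the dataset ships with and are not guaranteed here.

```python
from datasets import load_dataset

# Load the fine-tuning corpus from the Hub
ds = load_dataset("ArtifactAI/arxiv_python_research_code")
print(ds)                       # show available splits and features
print(next(iter(ds["train"])))  # peek at one example (assumes a "train" split)
```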
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="marco-molinari/python-code-millenials-1b", trust_remote_code=True)
```
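Calling the pipeline with a code prompt returns a list of generated completions. The example below is a sketch; the prompt and `max_new_tokens` are illustrative values.

```python
# Illustrative call; prompt and max_new_tokens are example values
result = pipe("import numpy as np\n\ndef moving_average(x, window):", max_new_tokens=64)
print(result[0]["generated_text"])
```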