Commit c0a69b2
Parent(s): 67fd6ae
Update README.md
README.md CHANGED

@@ -45,17 +45,19 @@ metrics:
 ## Usage

 The easiest way to use this model locally is via the [Transformers](https://huggingface.co/docs/transformers/index) library [pipelines for inference](https://huggingface.co/docs/transformers/pipeline_tutorial).
-
+
+Once you have [installed transformers](https://huggingface.co/docs/transformers/installation), you can run the following code.
+This will download and cache the model locally and allow you to make predictions on text input.


 ```
 from transformers import pipeline

-classifier = pipeline('text-classification', "
+classifier = pipeline('text-classification', "biglam/autotrain-beyond-the-books")
 classifier(text)
 ```

-This will return predictions in the following format
+This will return predictions in the following format:

 ```
 [{'label': 'no_jim_crow', 'score': 0.9718555212020874}]
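For completeness, a minimal, self-contained version of the snippet added in this commit might look like the sketch below. The model id `biglam/autotrain-beyond-the-books` and the output format are taken from the diff above; the example sentence and the `print` call are illustrative additions, not part of the commit.

```
from transformers import pipeline

# Build a text-classification pipeline; the first call downloads
# and caches the model weights from the Hugging Face Hub.
classifier = pipeline("text-classification", model="biglam/autotrain-beyond-the-books")

# Illustrative input sentence; substitute your own text here.
text = "An example sentence to classify."
predictions = classifier(text)

# Returns a list of dicts: [{'label': <predicted label>, 'score': <confidence>}]
print(predictions)
```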