Dataset Viewer
Auto-converted to Parquet
Columns (name: type, with string-length range or distinct-value count):

url: stringlengths 62–66
repository_url: stringclasses (1 value)
labels_url: stringlengths 76–80
comments_url: stringlengths 71–75
events_url: stringlengths 69–73
html_url: stringlengths 50–56
id: int64 (377M–2.15B)
node_id: stringlengths 18–32
number: int64 (1–29.2k)
title: stringlengths 1–487
user: dict
labels: list
state: stringclasses (2 values)
locked: bool (2 classes)
assignee: dict
assignees: list
comments: sequence
created_at: int64 (1.54k–1.71k)
updated_at: int64 (1.54k–1.71k)
closed_at: int64 (1.54k–1.71k)
author_association: stringclasses (4 values)
active_lock_reason: stringclasses (2 values)
body: stringlengths 0–234k
reactions: dict
timeline_url: stringlengths 71–75
state_reason: stringclasses (3 values)
draft: bool (2 classes)
pull_request: dict
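The preview rows below list their values in this column order. As a minimal sketch of how such a dataset could be consumed, the snippet below loads it with the `datasets` library and inspects a few fields; the repo id `<user>/transformers-issues` is a hypothetical placeholder, since the dataset's actual Hub identifier is not shown on this page.

```python
# Minimal sketch: load the issues dataset shown above and inspect it.
# The repo id below is a hypothetical placeholder; substitute the dataset's
# real Hub identifier from this page.
from datasets import load_dataset

ds = load_dataset("<user>/transformers-issues", split="train")

# Each row matches the column list above: flat URL/id fields, nested dicts
# (user, reactions, pull_request) and lists (labels, assignees, comments).
row = ds[0]
print(row["number"], row["title"], row["state"])

# Example: separate plain issues from pull requests in the preview rows.
pull_requests = ds.filter(lambda r: r["pull_request"] is not None)
issues = ds.filter(lambda r: r["pull_request"] is None)
print(len(pull_requests), "pull requests,", len(issues), "issues")
```

Nested struct columns such as `user` and `reactions` can also be expanded into top-level columns with `ds.flatten()` if a flat table is easier to work with.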
https://api.github.com/repos/huggingface/transformers/issues/29161
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29161/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29161/comments
https://api.github.com/repos/huggingface/transformers/issues/29161/events
https://github.com/huggingface/transformers/issues/29161
2,145,902,969
I_kwDOCUB6oc5_5-F5
29,161
To enter token in jupyter notebook issue
{ "login": "arda1906", "id": 157398066, "node_id": "U_kgDOCWG0Mg", "avatar_url": "https://avatars.githubusercontent.com/u/157398066?v=4", "gravatar_id": "", "url": "https://api.github.com/users/arda1906", "html_url": "https://github.com/arda1906", "followers_url": "https://api.github.com/users/arda1906/...
[]
open
false
null
[]
[ "Hi @arda1906, thanks for raising an issue!\r\n\r\nWithout more information about the error i.e. what does it mean to \"not work\" and what is the expected behaviour? we won't be able to help you. \r\n\r\nFrom the snippet, it's not entirely clear how the code is being run, but there are two separate commands which...
1,708
1,708
null
NONE
null
I run this [from huggingface_hub import notebook_login notebook_login() ] on cell and enter my token. but it doesn't work:(
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29161/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29161/timeline
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/29160
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29160/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29160/comments
https://api.github.com/repos/huggingface/transformers/issues/29160/events
https://github.com/huggingface/transformers/pull/29160
2,145,779,053
PR_kwDOCUB6oc5neHY8
29,160
[WIP] add Fusion In Decoder model
{ "login": "oh-gnues-iohc", "id": 79557937, "node_id": "MDQ6VXNlcjc5NTU3OTM3", "avatar_url": "https://avatars.githubusercontent.com/u/79557937?v=4", "gravatar_id": "", "url": "https://api.github.com/users/oh-gnues-iohc", "html_url": "https://github.com/oh-gnues-iohc", "followers_url": "https://api.githu...
[]
open
false
null
[]
[]
1,708
1,708
null
NONE
null
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this w...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29160/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29160/timeline
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/29160", "html_url": "https://github.com/huggingface/transformers/pull/29160", "diff_url": "https://github.com/huggingface/transformers/pull/29160.diff", "patch_url": "https://github.com/huggingface/transformers/pull/29160.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/29159
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29159/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29159/comments
https://api.github.com/repos/huggingface/transformers/issues/29159/events
https://github.com/huggingface/transformers/issues/29159
2,145,650,790
I_kwDOCUB6oc5_5Ahm
29,159
[tokenizer] Inconsistent behavior in slow tokenizer and fast tokenizer
{ "login": "Ki-Seki", "id": 60967965, "node_id": "MDQ6VXNlcjYwOTY3OTY1", "avatar_url": "https://avatars.githubusercontent.com/u/60967965?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Ki-Seki", "html_url": "https://github.com/Ki-Seki", "followers_url": "https://api.github.com/users/Ki-Sek...
[ { "id": 2392046359, "node_id": "MDU6TGFiZWwyMzkyMDQ2MzU5", "url": "https://api.github.com/repos/huggingface/transformers/labels/Good%20Second%20Issue", "name": "Good Second Issue", "color": "dd935a", "default": false, "description": "Issues that are more difficult to do than \"Good First...
open
false
null
[]
[ "Hey! Thanks for opening an issue. \r\nFew things first. You are using a custom / local checkpoint with trust remote code. \r\n\r\nFast is not erroring out when you feed OOV, while slow is and it is indeed inconsistent. Would you like to open a PR for a fix? 🤗 ", "Yes, I'll try that. Thank you for your reply!" ]
1,708
1,708
null
CONTRIBUTOR
null
### System Info - `transformers` version: 4.35.2 - Platform: Linux-5.4.0-163-generic-x86_64-with-glibc2.10 - Python version: 3.8.18 - Huggingface_hub version: 0.19.4 - Safetensors version: 0.4.1 - Accelerate version: not installed - Accelerate config: not found - PyTorch version (GPU?): 2.1.1+cu121 (True) - ...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29159/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29159/timeline
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/29158
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29158/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29158/comments
https://api.github.com/repos/huggingface/transformers/issues/29158/events
https://github.com/huggingface/transformers/pull/29158
2,145,552,337
PR_kwDOCUB6oc5ndVY6
29,158
[PyTorch/XLA] Fix extra TPU compilations introduced by recent changes
{ "login": "alanwaketan", "id": 8573935, "node_id": "MDQ6VXNlcjg1NzM5MzU=", "avatar_url": "https://avatars.githubusercontent.com/u/8573935?v=4", "gravatar_id": "", "url": "https://api.github.com/users/alanwaketan", "html_url": "https://github.com/alanwaketan", "followers_url": "https://api.github.com/us...
[]
open
false
null
[]
[]
1,708
1,708
null
CONTRIBUTOR
null
# What does this PR do? This PR tries to fix some extra TPU compilations caused by recent HF changes. 1. PyTorch/XLA doesn't support sdpa yet. So we need to set the default attention implementation to eager. 2. tensor.item() will trigger tpu graph synchronization. We should avoid using it in the training loop. ...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29158/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29158/timeline
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/29158", "html_url": "https://github.com/huggingface/transformers/pull/29158", "diff_url": "https://github.com/huggingface/transformers/pull/29158.diff", "patch_url": "https://github.com/huggingface/transformers/pull/29158.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/29157
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29157/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29157/comments
https://api.github.com/repos/huggingface/transformers/issues/29157/events
https://github.com/huggingface/transformers/issues/29157
2,145,549,903
I_kwDOCUB6oc5_4n5P
29,157
Error while saving with EarlyStoppingCallback
{ "login": "dhruvmullick", "id": 7004024, "node_id": "MDQ6VXNlcjcwMDQwMjQ=", "avatar_url": "https://avatars.githubusercontent.com/u/7004024?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhruvmullick", "html_url": "https://github.com/dhruvmullick", "followers_url": "https://api.github.com...
[]
open
false
null
[]
[]
1,708
1,708
null
NONE
null
### System Info - `transformers` version: 4.38.0.dev0 - Platform: Linux-5.15.0-78-generic-x86_64-with-glibc2.35 - Python version: 3.10.12 - Huggingface_hub version: 0.20.3 - Safetensors version: 0.4.2 - Accelerate version: 0.28.0.dev0 - Accelerate config: not found - PyTorch version (GPU?): 2.1.2+cu121 (True...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29157/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29157/timeline
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/29156
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29156/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29156/comments
https://api.github.com/repos/huggingface/transformers/issues/29156/events
https://github.com/huggingface/transformers/pull/29156
2,145,522,407
PR_kwDOCUB6oc5ndO3J
29,156
Making extensible
{ "login": "ddevaul", "id": 71190628, "node_id": "MDQ6VXNlcjcxMTkwNjI4", "avatar_url": "https://avatars.githubusercontent.com/u/71190628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ddevaul", "html_url": "https://github.com/ddevaul", "followers_url": "https://api.github.com/users/ddevau...
[]
open
false
null
[]
[ "Hi @ddevaul, what is the purpose of this PR? \r\n" ]
1,708
1,708
null
NONE
null
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this w...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29156/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29156/timeline
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/29156", "html_url": "https://github.com/huggingface/transformers/pull/29156", "diff_url": "https://github.com/huggingface/transformers/pull/29156.diff", "patch_url": "https://github.com/huggingface/transformers/pull/29156.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/29155
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29155/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29155/comments
https://api.github.com/repos/huggingface/transformers/issues/29155/events
https://github.com/huggingface/transformers/issues/29155
2,145,382,760
I_kwDOCUB6oc5_3_Fo
29,155
PyTest import error
{ "login": "loadams", "id": 114770087, "node_id": "U_kgDOBtdApw", "avatar_url": "https://avatars.githubusercontent.com/u/114770087?v=4", "gravatar_id": "", "url": "https://api.github.com/users/loadams", "html_url": "https://github.com/loadams", "followers_url": "https://api.github.com/users/loadams/foll...
[]
open
false
null
[]
[]
1,708
1,708
null
NONE
null
### System Info Current head of transformers shows this issue, when importing functions from pytest, the `import_path` function is not found. Sample error from DeepSpeed's unit tests [here](https://github.com/microsoft/DeepSpeed/actions/runs/7977730884/job/21781270161?pr=5164#step:7:391). ``` ______________ ERROR...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29155/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29155/timeline
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/29154
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29154/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29154/comments
https://api.github.com/repos/huggingface/transformers/issues/29154/events
https://github.com/huggingface/transformers/pull/29154
2,145,294,779
PR_kwDOCUB6oc5nccpR
29,154
Update pytest `import_path` location
{ "login": "loadams", "id": 114770087, "node_id": "U_kgDOBtdApw", "avatar_url": "https://avatars.githubusercontent.com/u/114770087?v=4", "gravatar_id": "", "url": "https://api.github.com/users/loadams", "html_url": "https://github.com/loadams", "followers_url": "https://api.github.com/users/loadams/foll...
[]
open
false
null
[]
[ "The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29154). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update." ]
1,708
1,708
null
NONE
null
# What does this PR do? Fixes location of `import_path` from pytest from `_pytest.doctest` to `_pytest.pathlib` when using PyTest 8.0.1+ since it is finally deprecated from being in `_pytest.doctest`. It is provided in `_pytest.pathlib` from at least 7.2.0+ so we do not need to modify the supported pytest range in ...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29154/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29154/timeline
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/29154", "html_url": "https://github.com/huggingface/transformers/pull/29154", "diff_url": "https://github.com/huggingface/transformers/pull/29154.diff", "patch_url": "https://github.com/huggingface/transformers/pull/29154.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/29153
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29153/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29153/comments
https://api.github.com/repos/huggingface/transformers/issues/29153/events
https://github.com/huggingface/transformers/issues/29153
2,145,101,851
I_kwDOCUB6oc5_26gb
29,153
Plans to add DoRA?
{ "login": "RonanKMcGovern", "id": 78278410, "node_id": "MDQ6VXNlcjc4Mjc4NDEw", "avatar_url": "https://avatars.githubusercontent.com/u/78278410?v=4", "gravatar_id": "", "url": "https://api.github.com/users/RonanKMcGovern", "html_url": "https://github.com/RonanKMcGovern", "followers_url": "https://api.gi...
[ { "id": 2648621985, "node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1", "url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request", "name": "Feature request", "color": "FBCA04", "default": false, "description": "Request for a new feature" } ]
open
false
null
[]
[ "cc @younesbelkada @pacman100 ", "Hi @RonanKMcGovern ! \r\nThanks for the feature request! There is already an ongoing work from @BenjaminBossan to add DoRA in PEFT: https://github.com/huggingface/peft/pull/1474", "Closing as there is a PR underway.", "OK thank you @RonanKMcGovern !" ]
1,708
1,708
null
NONE
null
### Feature request Improves on LoRA by allowing magnitude fine-tuning. ### Motivation Improved perplexity. ### Your contribution Sebastien Bubeck has published demo code. https://github.com/rasbt/dora-from-scratch
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29153/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29153/timeline
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/29152
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29152/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29152/comments
https://api.github.com/repos/huggingface/transformers/issues/29152/events
https://github.com/huggingface/transformers/pull/29152
2,145,071,699
PR_kwDOCUB6oc5nbr5K
29,152
Alternative approach
{ "login": "amyeroberts", "id": 22614925, "node_id": "MDQ6VXNlcjIyNjE0OTI1", "avatar_url": "https://avatars.githubusercontent.com/u/22614925?v=4", "gravatar_id": "", "url": "https://api.github.com/users/amyeroberts", "html_url": "https://github.com/amyeroberts", "followers_url": "https://api.github.com/...
[]
open
false
null
[]
[ "cc @Rocketknight1 ", "The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29152). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update." ]
1,708
1,708
null
COLLABORATOR
null
# What does this PR do? Alternative way to use stop words for generated sequences. Note - it doesn't <details> <summary>Script</summary> ```py import time import numpy as np from transformers.generation.stopping_criteria import StopStringCriteria, StopStringCriteria2 from transformers import AutoToke...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29152/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29152/timeline
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/29152", "html_url": "https://github.com/huggingface/transformers/pull/29152", "diff_url": "https://github.com/huggingface/transformers/pull/29152.diff", "patch_url": "https://github.com/huggingface/transformers/pull/29152.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/29151
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29151/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29151/comments
https://api.github.com/repos/huggingface/transformers/issues/29151/events
https://github.com/huggingface/transformers/issues/29151
2,145,069,207
I_kwDOCUB6oc5_2yiX
29,151
Static cache + torch.compile: support prefill static sequence length
{ "login": "fxmarty", "id": 9808326, "node_id": "MDQ6VXNlcjk4MDgzMjY=", "avatar_url": "https://avatars.githubusercontent.com/u/9808326?v=4", "gravatar_id": "", "url": "https://api.github.com/users/fxmarty", "html_url": "https://github.com/fxmarty", "followers_url": "https://api.github.com/users/fxmarty/...
[]
open
false
null
[]
[ "cc @gante ", "@fxmarty this is the same problem as we have in TF and Flax. There, we nudged users to use the `pad_to_multiple_of` argument in the tokenizer, which I believe solves the problem 🤗 \r\n\r\nHow do you suggest us to let users know about this feature, other than docs?" ]
1,708
1,708
null
COLLABORATOR
null
### Feature request When using torch.compile, the prefill is recompiled for every new sequence length, which is slow. It may be nice to be able to compile only say for some sequence lengths (`1, 2, 4, 16, 32, 64, 128, 256, 512, 1024, 2048, 4096, etc`) on the fly depending on the input lengths, using some padding. ###...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29151/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29151/timeline
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/29150
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29150/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29150/comments
https://api.github.com/repos/huggingface/transformers/issues/29150/events
https://github.com/huggingface/transformers/issues/29150
2,144,941,834
I_kwDOCUB6oc5_2TcK
29,150
Difficulty in adding custom model
{ "login": "El-chapo-007", "id": 125077963, "node_id": "U_kgDOB3SJyw", "avatar_url": "https://avatars.githubusercontent.com/u/125077963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/El-chapo-007", "html_url": "https://github.com/El-chapo-007", "followers_url": "https://api.github.com/use...
[]
open
false
null
[]
[ "Hi @El-chapo-007, thanks for opening this issue! \r\n\r\nGlad to hear that your journey has been mostly successful 🤗 \r\n\r\nHave you seen our documentation page about adding custom models? This should contain all the info and example code needed to get started: https://huggingface.co/docs/transformers/custom_mod...
1,708
1,708
null
NONE
null
### Feature request Hi Hope all the team members of hugging face are well I am a student and currently doing work on nlp projects , although most of my journey was successful because well documented information for starters especially example notebooks but what part is confusing and difficult is to upload and cr...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29150/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29150/timeline
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/29149
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29149/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29149/comments
https://api.github.com/repos/huggingface/transformers/issues/29149/events
https://github.com/huggingface/transformers/issues/29149
2,144,914,235
I_kwDOCUB6oc5_2Ms7
29,149
Generate: support passing position_ids
{ "login": "gante", "id": 12240844, "node_id": "MDQ6VXNlcjEyMjQwODQ0", "avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4", "gravatar_id": "", "url": "https://api.github.com/users/gante", "html_url": "https://github.com/gante", "followers_url": "https://api.github.com/users/gante/follow...
[]
open
false
{ "login": "gante", "id": 12240844, "node_id": "MDQ6VXNlcjEyMjQwODQ0", "avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4", "gravatar_id": "", "url": "https://api.github.com/users/gante", "html_url": "https://github.com/gante", "followers_url": "https://api.github.com/users/gante/follow...
[ { "login": "gante", "id": 12240844, "node_id": "MDQ6VXNlcjEyMjQwODQ0", "avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4", "gravatar_id": "", "url": "https://api.github.com/users/gante", "html_url": "https://github.com/gante", "followers_url": "https://api.github.co...
[ "@zucchini-nlp FYI. We shouldn't fix this now, as it requires significant manual labor to update all models. After the static cache sprint we should have a look at this :)" ]
1,708
1,708
null
MEMBER
null
Thank you @tengomucho, for uncovering this bug. ### The problem In a nutshell, passing the correct `position_ids` to `generate` should result in exactly the same results as not passing them. In other words, the following test should pass on all models, if added to `GenerationTesterMixin`. We can see that it is fa...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29149/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29149/timeline
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/29148
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29148/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29148/comments
https://api.github.com/repos/huggingface/transformers/issues/29148/events
https://github.com/huggingface/transformers/pull/29148
2,144,911,415
PR_kwDOCUB6oc5nbILV
29,148
Token level timestamps for long-form generation in Whisper
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/use...
[]
open
false
null
[]
[ "The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29148). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update." ]
1,708
1,708
null
MEMBER
null
# What does this PR do? Continuation of PR #28984. Adds token level timestamps for long-form generation. The previous PR had a quite different of way to add timestamps, specifically by calling `extract_timestamps` for each segment and each batch separately. I believe, it can be done in one batch, and then divided in...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29148/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29148/timeline
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/29148", "html_url": "https://github.com/huggingface/transformers/pull/29148", "diff_url": "https://github.com/huggingface/transformers/pull/29148.diff", "patch_url": "https://github.com/huggingface/transformers/pull/29148.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/29147
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29147/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29147/comments
https://api.github.com/repos/huggingface/transformers/issues/29147/events
https://github.com/huggingface/transformers/pull/29147
2,144,785,389
PR_kwDOCUB6oc5nasd-
29,147
Fix drop path being ignored in DINOv2
{ "login": "fepegar", "id": 12688084, "node_id": "MDQ6VXNlcjEyNjg4MDg0", "avatar_url": "https://avatars.githubusercontent.com/u/12688084?v=4", "gravatar_id": "", "url": "https://api.github.com/users/fepegar", "html_url": "https://github.com/fepegar", "followers_url": "https://api.github.com/users/fepega...
[]
closed
false
null
[]
[ "Thanks for reviewing, @amyeroberts!" ]
1,708
1,708
1,708
CONTRIBUTOR
null
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this w...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29147/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29147/timeline
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/29147", "html_url": "https://github.com/huggingface/transformers/pull/29147", "diff_url": "https://github.com/huggingface/transformers/pull/29147.diff", "patch_url": "https://github.com/huggingface/transformers/pull/29147.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/29146
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29146/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29146/comments
https://api.github.com/repos/huggingface/transformers/issues/29146/events
https://github.com/huggingface/transformers/pull/29146
2,144,586,510
PR_kwDOCUB6oc5naAbp
29,146
Generate: missing generation config eos token setting in encoder-decoder tests
{ "login": "gante", "id": 12240844, "node_id": "MDQ6VXNlcjEyMjQwODQ0", "avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4", "gravatar_id": "", "url": "https://api.github.com/users/gante", "html_url": "https://github.com/gante", "followers_url": "https://api.github.com/users/gante/follow...
[]
closed
false
null
[]
[ "The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29146). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update." ]
1,708
1,708
1,708
MEMBER
null
# What does this PR do? These tests were failing with low likelihood, all for the same reason as fixed in [this recent PR](https://github.com/huggingface/transformers/pull/28923): there should be no EOS token to enable endless generation, but the generation config still had the default value. I couldn't find more...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29146/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29146/timeline
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/29146", "html_url": "https://github.com/huggingface/transformers/pull/29146", "diff_url": "https://github.com/huggingface/transformers/pull/29146.diff", "patch_url": "https://github.com/huggingface/transformers/pull/29146.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/29145
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29145/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29145/comments
https://api.github.com/repos/huggingface/transformers/issues/29145/events
https://github.com/huggingface/transformers/issues/29145
2,144,556,865
I_kwDOCUB6oc5_01dB
29,145
AI2 Olmo 7B does not support Flash-Attention 2.0. ValueError: OLMoForCausalLM does not support Flash Attention 2.0 yet.
{ "login": "KaifAhmad1", "id": 98801504, "node_id": "U_kgDOBeOXYA", "avatar_url": "https://avatars.githubusercontent.com/u/98801504?v=4", "gravatar_id": "", "url": "https://api.github.com/users/KaifAhmad1", "html_url": "https://github.com/KaifAhmad1", "followers_url": "https://api.github.com/users/KaifA...
[ { "id": 1843244711, "node_id": "MDU6TGFiZWwxODQzMjQ0NzEx", "url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model", "name": "New model", "color": "fbca04", "default": false, "description": "" } ]
closed
false
null
[]
[]
1,708
1,708
1,708
NONE
null
### Model description Model Name: allenai/OLMo-7B ### Open source status - [X] The model implementation is available - [X] The model weights are available ### Provide useful links for the implementation _No response_
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29145/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29145/timeline
completed
null
null
https://api.github.com/repos/huggingface/transformers/issues/29144
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29144/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29144/comments
https://api.github.com/repos/huggingface/transformers/issues/29144/events
https://github.com/huggingface/transformers/pull/29144
2,144,483,260
PR_kwDOCUB6oc5nZpun
29,144
bug-fix: avoid 'Expected all tensors to be on the same device' error when doing multi-GPU training
{ "login": "kallewoof", "id": 250224, "node_id": "MDQ6VXNlcjI1MDIyNA==", "avatar_url": "https://avatars.githubusercontent.com/u/250224?v=4", "gravatar_id": "", "url": "https://api.github.com/users/kallewoof", "html_url": "https://github.com/kallewoof", "followers_url": "https://api.github.com/users/kall...
[]
open
false
null
[]
[]
1,708
1,708
null
NONE
null
When doing DPO training, if the model has been split over multiple GPUs, the `tr_loss` and the `tr_loss_step` end up on different devices at some point, resulting in a ``` Expected all tensors to be on the same device, but found at least two devices, cuda:0 and cuda:1 ``` error. This patch makes an explicit copy...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29144/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29144/timeline
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/29144", "html_url": "https://github.com/huggingface/transformers/pull/29144", "diff_url": "https://github.com/huggingface/transformers/pull/29144.diff", "patch_url": "https://github.com/huggingface/transformers/pull/29144.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/29143
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29143/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29143/comments
https://api.github.com/repos/huggingface/transformers/issues/29143/events
https://github.com/huggingface/transformers/pull/29143
2,144,476,455
PR_kwDOCUB6oc5nZoPN
29,143
Llama: update rope scaling to match static cache changes
{ "login": "gante", "id": 12240844, "node_id": "MDQ6VXNlcjEyMjQwODQ0", "avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4", "gravatar_id": "", "url": "https://api.github.com/users/gante", "html_url": "https://github.com/gante", "followers_url": "https://api.github.com/users/gante/follow...
[]
open
false
null
[]
[ "The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29143). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update." ]
1,708
1,708
null
MEMBER
null
# What does this PR do? (see title :)) Review suggestion: 1. Review changes in Llama 2. Review the rest
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29143/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29143/timeline
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/29143", "html_url": "https://github.com/huggingface/transformers/pull/29143", "diff_url": "https://github.com/huggingface/transformers/pull/29143.diff", "patch_url": "https://github.com/huggingface/transformers/pull/29143.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/29142
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29142/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29142/comments
https://api.github.com/repos/huggingface/transformers/issues/29142/events
https://github.com/huggingface/transformers/pull/29142
2,144,430,707
PR_kwDOCUB6oc5nZeOR
29,142
Add training version check for AQLM quantizer.
{ "login": "BlackSamorez", "id": 16901341, "node_id": "MDQ6VXNlcjE2OTAxMzQx", "avatar_url": "https://avatars.githubusercontent.com/u/16901341?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BlackSamorez", "html_url": "https://github.com/BlackSamorez", "followers_url": "https://api.github.c...
[]
open
false
null
[]
[ "The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29142). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update." ]
1,708
1,708
null
CONTRIBUTOR
null
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this w...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29142/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29142/timeline
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/29142", "html_url": "https://github.com/huggingface/transformers/pull/29142", "diff_url": "https://github.com/huggingface/transformers/pull/29142.diff", "patch_url": "https://github.com/huggingface/transformers/pull/29142.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/29141
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29141/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29141/comments
https://api.github.com/repos/huggingface/transformers/issues/29141/events
https://github.com/huggingface/transformers/pull/29141
2,144,232,619
PR_kwDOCUB6oc5nYyzq
29,141
Save (circleci) cache at the end of a job
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/...
[]
closed
false
null
[]
[ "The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29141). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update." ]
1,708
1,708
1,708
COLLABORATOR
null
# What does this PR do? This way, `pytest` will run before `cache saving` and we have access to the results earlier in the case of partial or no cache loaded.
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29141/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29141/timeline
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/29141", "html_url": "https://github.com/huggingface/transformers/pull/29141", "diff_url": "https://github.com/huggingface/transformers/pull/29141.diff", "patch_url": "https://github.com/huggingface/transformers/pull/29141.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/29140
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29140/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29140/comments
https://api.github.com/repos/huggingface/transformers/issues/29140/events
https://github.com/huggingface/transformers/issues/29140
2,144,160,231
I_kwDOCUB6oc5_zUnn
29,140
Drop path is ignored in DINOv2
{ "login": "fepegar", "id": 12688084, "node_id": "MDQ6VXNlcjEyNjg4MDg0", "avatar_url": "https://avatars.githubusercontent.com/u/12688084?v=4", "gravatar_id": "", "url": "https://api.github.com/users/fepegar", "html_url": "https://github.com/fepegar", "followers_url": "https://api.github.com/users/fepega...
[]
closed
false
null
[]
[ "Hey, thanks for the issue! I've checked out your branch, from what I'm seeing tests are passing on your fix, would you mind opening a PR? \r\nAlso, since this will affect training, do you have a script that compares both in a training scenario? AFAIK current integration tests for Dinov2 are not in a training setti...
1,708
1,708
1,708
CONTRIBUTOR
null
### System Info - `transformers` version: 4.38.0.dev0 - Platform: Linux-5.15.0-91-generic-x86_64-with-glibc2.31 - Python version: 3.11.7 - Huggingface_hub version: 0.20.3 - Safetensors version: 0.4.2 - Accelerate version: not installed - Accelerate config: not found - PyTorch version (GPU?): 2.2.0 (True) - Ten...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29140/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29140/timeline
completed
null
null
https://api.github.com/repos/huggingface/transformers/issues/29139
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29139/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29139/comments
https://api.github.com/repos/huggingface/transformers/issues/29139/events
https://github.com/huggingface/transformers/issues/29139
2,144,132,992
I_kwDOCUB6oc5_zN-A
29,139
past_key_values for SeamlessM4Tv2ForSpeechToText is not working as expected
{ "login": "vapemaster-kz", "id": 65128133, "node_id": "MDQ6VXNlcjY1MTI4MTMz", "avatar_url": "https://avatars.githubusercontent.com/u/65128133?v=4", "gravatar_id": "", "url": "https://api.github.com/users/vapemaster-kz", "html_url": "https://github.com/vapemaster-kz", "followers_url": "https://api.githu...
[]
open
false
null
[]
[ "cc @ylacombe " ]
1,708
1,708
null
NONE
null
### System Info transformers version: 4.37.2 python verison: 3.8.6. OS: Windows 11 ### Who can help? @sanchit-gandhi ### Information - [ ] The official example scripts - [X] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [X] My own task...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29139/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29139/timeline
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/29138
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29138/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29138/comments
https://api.github.com/repos/huggingface/transformers/issues/29138/events
https://github.com/huggingface/transformers/pull/29138
2,144,115,768
PR_kwDOCUB6oc5nYZN3
29,138
Fix ROPE embeddings for LLama
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/use...
[]
closed
false
null
[]
[]
1,708
1,708
1,708
MEMBER
null
# What does this PR do? This [test](https://app.circleci.com/pipelines/github/huggingface/transformers/84847/workflows/2a5e5769-9431-4e2b-babb-81a112558a97/jobs/1098065) failed on my PR and I checked to see the reason. I found that the changes introduced to make llama compile compatible are causing the issue. Th...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29138/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29138/timeline
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/29138", "html_url": "https://github.com/huggingface/transformers/pull/29138", "diff_url": "https://github.com/huggingface/transformers/pull/29138.diff", "patch_url": "https://github.com/huggingface/transformers/pull/29138.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/29137
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29137/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29137/comments
https://api.github.com/repos/huggingface/transformers/issues/29137/events
https://github.com/huggingface/transformers/issues/29137
2,144,069,859
I_kwDOCUB6oc5_y-jj
29,137
transformers.AutoTokenizer.from_pretrained( ... use_Fast=False) fails with 'TypeError: not a string' for some tokenizers
{ "login": "Jeronymous", "id": 22522728, "node_id": "MDQ6VXNlcjIyNTIyNzI4", "avatar_url": "https://avatars.githubusercontent.com/u/22522728?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Jeronymous", "html_url": "https://github.com/Jeronymous", "followers_url": "https://api.github.com/use...
[]
closed
false
null
[]
[ "cc @ArthurZucker ", "Hey! Thanks for reporting. \r\n`tokenizer.Load(self.vocab_file)` seems to be the issue here. If you check the repo it does not have the `tokenizer.model` .\r\nYou should raise the issue there! \r\n", "Thanks @ArthurZucker 👍 " ]
1,708
1,708
1,708
NONE
null
### System Info - `transformers` version: 4.37.2 - Platform: Linux-5.15.133.1-microsoft-standard-WSL2-x86_64-with-glibc2.35 - Python version: 3.10.12 - Huggingface_hub version: 0.19.4 - Safetensors version: 0.4.1 - Accelerate version: not installed - Accelerate config: not found - PyTorch version (GPU?...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29137/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29137/timeline
completed
null
null
https://api.github.com/repos/huggingface/transformers/issues/29136
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29136/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29136/comments
https://api.github.com/repos/huggingface/transformers/issues/29136/events
https://github.com/huggingface/transformers/pull/29136
2,144,048,828
PR_kwDOCUB6oc5nYKjd
29,136
Generate: low memory tests are flaky
{ "login": "gante", "id": 12240844, "node_id": "MDQ6VXNlcjEyMjQwODQ0", "avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4", "gravatar_id": "", "url": "https://api.github.com/users/gante", "html_url": "https://github.com/gante", "followers_url": "https://api.github.com/users/gante/follow...
[]
open
false
null
[]
[ "The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29136). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.", "@amyeroberts #29109 seems to have fixed most of the issue (this test does compare ...
1,708
1,708
null
MEMBER
null
# What does this PR do? As identified by @molbap -- generate tests with the `low_memory` flag are flaky. The full reason is the same as explained in [this comment](https://github.com/huggingface/transformers/issues/25420#issuecomment-1775317535). The error likelihood has low (~3%), but still quite disruptive for ...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29136/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29136/timeline
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/29136", "html_url": "https://github.com/huggingface/transformers/pull/29136", "diff_url": "https://github.com/huggingface/transformers/pull/29136.diff", "patch_url": "https://github.com/huggingface/transformers/pull/29136.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/29135
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29135/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29135/comments
https://api.github.com/repos/huggingface/transformers/issues/29135/events
https://github.com/huggingface/transformers/pull/29135
2,144,037,386
PR_kwDOCUB6oc5nYICS
29,135
Revert low cpu mem tie weights
{ "login": "amyeroberts", "id": 22614925, "node_id": "MDQ6VXNlcjIyNjE0OTI1", "avatar_url": "https://avatars.githubusercontent.com/u/22614925?v=4", "gravatar_id": "", "url": "https://api.github.com/users/amyeroberts", "html_url": "https://github.com/amyeroberts", "followers_url": "https://api.github.com/...
[]
closed
false
null
[]
[ "The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29135). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.", "Sounds good, thanks for taking care of this!" ]
1,708
1,708
1,708
COLLABORATOR
null
# What does this PR do? Reverts #28948 and #29043 See relevant comment: https://github.com/huggingface/transformers/pull/29110#issuecomment-1953847826 cc @hackyon @ydshieh
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29135/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29135/timeline
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/29135", "html_url": "https://github.com/huggingface/transformers/pull/29135", "diff_url": "https://github.com/huggingface/transformers/pull/29135.diff", "patch_url": "https://github.com/huggingface/transformers/pull/29135.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/29134
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29134/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29134/comments
https://api.github.com/repos/huggingface/transformers/issues/29134/events
https://github.com/huggingface/transformers/pull/29134
2,143,960,967
PR_kwDOCUB6oc5nX3V4
29,134
Add generate kwargs to VQA pipeline
{ "login": "regisss", "id": 15324346, "node_id": "MDQ6VXNlcjE1MzI0MzQ2", "avatar_url": "https://avatars.githubusercontent.com/u/15324346?v=4", "gravatar_id": "", "url": "https://api.github.com/users/regisss", "html_url": "https://github.com/regisss", "followers_url": "https://api.github.com/users/regiss...
[]
open
false
null
[]
[ "The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29134). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update." ]
1,708
1,708
null
CONTRIBUTOR
null
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this w...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29134/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29134/timeline
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/29134", "html_url": "https://github.com/huggingface/transformers/pull/29134", "diff_url": "https://github.com/huggingface/transformers/pull/29134.diff", "patch_url": "https://github.com/huggingface/transformers/pull/29134.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/29133
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29133/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29133/comments
https://api.github.com/repos/huggingface/transformers/issues/29133/events
https://github.com/huggingface/transformers/pull/29133
2,143,951,741
PR_kwDOCUB6oc5nX1Va
29,133
[`cuda kernels`] only compile them when initializing
{ "login": "ArthurZucker", "id": 48595927, "node_id": "MDQ6VXNlcjQ4NTk1OTI3", "avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ArthurZucker", "html_url": "https://github.com/ArthurZucker", "followers_url": "https://api.github.c...
[]
closed
false
null
[]
[ "The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29133). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.", "I'll make sure of that before merging! Testing now!", "```bash\r\nFAILED tests/m...
1,708
1,708
1,708
COLLABORATOR
null
# What does this PR do? Fixes #29130, from 1min to 6seconds
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29133/reactions", "total_count": 3, "+1": 0, "-1": 0, "laugh": 0, "hooray": 3, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29133/timeline
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/29133", "html_url": "https://github.com/huggingface/transformers/pull/29133", "diff_url": "https://github.com/huggingface/transformers/pull/29133.diff", "patch_url": "https://github.com/huggingface/transformers/pull/29133.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/29132
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29132/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29132/comments
https://api.github.com/repos/huggingface/transformers/issues/29132/events
https://github.com/huggingface/transformers/issues/29132
2,143,872,350
I_kwDOCUB6oc5_yOVe
29,132
SPAM
{ "login": "cook9019", "id": 141466977, "node_id": "U_kgDOCG6dYQ", "avatar_url": "https://avatars.githubusercontent.com/u/141466977?v=4", "gravatar_id": "", "url": "https://api.github.com/users/cook9019", "html_url": "https://github.com/cook9019", "followers_url": "https://api.github.com/users/cook9019/...
[]
closed
false
null
[]
[]
1,708
1,708
1,708
NONE
null
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29132/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29132/timeline
not_planned
null
null
https://api.github.com/repos/huggingface/transformers/issues/29131
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29131/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29131/comments
https://api.github.com/repos/huggingface/transformers/issues/29131/events
https://github.com/huggingface/transformers/pull/29131
2,143,812,725
PR_kwDOCUB6oc5nXWfA
29,131
added the max_matching_ngram_size to GenerationConfig
{ "login": "mosheber", "id": 22236370, "node_id": "MDQ6VXNlcjIyMjM2Mzcw", "avatar_url": "https://avatars.githubusercontent.com/u/22236370?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mosheber", "html_url": "https://github.com/mosheber", "followers_url": "https://api.github.com/users/mos...
[]
open
false
null
[]
[]
1,708
1,708
null
CONTRIBUTOR
null
# What does this PR do? * Added the max_matching_ngram_size parameter into the GenerationConfig, for the PromptLookupCandidateGenerator. * Included the max_matching_ngram_size when calling the __init__ of PromptLookupCandidateGenerator in _get_candidate_generator, in case it is specified. ## Who can review? ...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29131/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29131/timeline
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/29131", "html_url": "https://github.com/huggingface/transformers/pull/29131", "diff_url": "https://github.com/huggingface/transformers/pull/29131.diff", "patch_url": "https://github.com/huggingface/transformers/pull/29131.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/29130
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29130/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29130/comments
https://api.github.com/repos/huggingface/transformers/issues/29130/events
https://github.com/huggingface/transformers/issues/29130
2,143,788,296
I_kwDOCUB6oc5_x50I
29,130
Move kernel compilation to init rather than at import stage
{ "login": "NielsRogge", "id": 48327001, "node_id": "MDQ6VXNlcjQ4MzI3MDAx", "avatar_url": "https://avatars.githubusercontent.com/u/48327001?v=4", "gravatar_id": "", "url": "https://api.github.com/users/NielsRogge", "html_url": "https://github.com/NielsRogge", "followers_url": "https://api.github.com/use...
[ { "id": 1862634478, "node_id": "MDU6TGFiZWwxODYyNjM0NDc4", "url": "https://api.github.com/repos/huggingface/transformers/labels/Should%20Fix", "name": "Should Fix", "color": "FF0000", "default": false, "description": "This has been identified as a bug and should be fixed." }, { "...
closed
false
null
[]
[]
1,708
1,708
1,708
CONTRIBUTOR
null
### Feature request Some models like Deformable DETR rely on custom CUDA kernels to be compiled as seen [here](https://github.com/huggingface/transformers/blob/f7ef7cec6c6c162087421f36a17eabdbb223579d/src/transformers/models/deformable_detr/modeling_deformable_detr.py#L54). Currently these are compiled when importi...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29130/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29130/timeline
completed
null
null
https://api.github.com/repos/huggingface/transformers/issues/29129
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29129/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29129/comments
https://api.github.com/repos/huggingface/transformers/issues/29129/events
https://github.com/huggingface/transformers/issues/29129
2,143,773,084
I_kwDOCUB6oc5_x2Gc
29,129
Flash attention implementation with BERT base model
{ "login": "ghost", "id": 10137, "node_id": "MDQ6VXNlcjEwMTM3", "avatar_url": "https://avatars.githubusercontent.com/u/10137?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ghost", "html_url": "https://github.com/ghost", "followers_url": "https://api.github.com/users/ghost/followers", "f...
[ { "id": 1843244711, "node_id": "MDU6TGFiZWwxODQzMjQ0NzEx", "url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model", "name": "New model", "color": "fbca04", "default": false, "description": "" } ]
open
false
null
[]
[ "Not that expert but I suggest you can try bettertransformer for extreme speed up. ( In my knowledge that flash-attn is mainly focused on kv cache which is not exist on Bert-like model in most cases. )", "> Not that expert but I suggest you can try bettertransformer for extreme speed up. ( In my knowledge that fl...
1,708
1,708
null
NONE
null
### Model description hello and thanks community. I am trying to replace standard attention by flash attention in the BERT base Model. Anyone please help not able to find any tutorial or any discussions. or just give some directions how to do that ..I have got the idea of making attention prob drop prob = 0 . it m...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29129/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29129/timeline
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/29128
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29128/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29128/comments
https://api.github.com/repos/huggingface/transformers/issues/29128/events
https://github.com/huggingface/transformers/issues/29128
2,143,692,799
I_kwDOCUB6oc5_xif_
29,128
bart-large-xsum model: There were missing keys in the checkpoint model loaded: ['model.encoder.embed_tokens.weight', 'model.decoder.embed_tokens.weight', 'lm_head.weight'].
{ "login": "Aisuko", "id": 8053949, "node_id": "MDQ6VXNlcjgwNTM5NDk=", "avatar_url": "https://avatars.githubusercontent.com/u/8053949?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Aisuko", "html_url": "https://github.com/Aisuko", "followers_url": "https://api.github.com/users/Aisuko/foll...
[]
open
false
null
[]
[ "cc @ArthurZucker @younesbelkada ", "Hey @Aisuko, could you provide a **minimal** reproducer ? That would help use! \r\nAlso note that the `generation parameters` issues can probably be safely ignored. The missing keys is however a bit more problematic! \r\nMight be tied weights that are not tied properly, is `ti...
1,708
1,708
null
NONE
null
### System Info - `transformers` version: 4.37.2 - Platform: Linux-5.15.133+-x86_64-with-glibc2.31 - Python version: 3.10.13 - Huggingface_hub version: 0.20.3 - Safetensors version: 0.4.2 - Accelerate version: 0.26.1 - Accelerate config: not found - PyTorch version (GPU?): 2.1.2 (True) - Tensorflow version ...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29128/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29128/timeline
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/29127
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29127/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29127/comments
https://api.github.com/repos/huggingface/transformers/issues/29127/events
https://github.com/huggingface/transformers/issues/29127
2,143,620,996
I_kwDOCUB6oc5_xQ-E
29,127
err_handle(layoutlmv3): Error message doesn't give much clarity when boxes don't contain enough information
{ "login": "Sushaanth-Suresh-Kumar", "id": 123300765, "node_id": "U_kgDOB1lrnQ", "avatar_url": "https://avatars.githubusercontent.com/u/123300765?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Sushaanth-Suresh-Kumar", "html_url": "https://github.com/Sushaanth-Suresh-Kumar", "followers_url...
[]
open
false
null
[]
[ "Would you like to open a PR to improve the error? 🤗 ", "Sure" ]
1,708
1,708
null
NONE
null
### System Info - `transformers` version: 4.37.2 - Platform: Windows-10-10.0.22000-SP0 - Python version: 3.11.5 - Huggingface_hub version: 0.20.3 - Safetensors version: 0.4.2 - Accelerate version: not installed - Accelerate config: not found - PyTorch version (GPU?): 2.2.0+cpu (False) - Tensorflow version (G...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29127/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29127/timeline
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/29126
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29126/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29126/comments
https://api.github.com/repos/huggingface/transformers/issues/29126/events
https://github.com/huggingface/transformers/issues/29126
2,143,539,045
I_kwDOCUB6oc5_w89l
29,126
WARNING: tokenization mismatch: 43 vs. 44. (ignored)
{ "login": "lucasjinreal", "id": 21303438, "node_id": "MDQ6VXNlcjIxMzAzNDM4", "avatar_url": "https://avatars.githubusercontent.com/u/21303438?v=4", "gravatar_id": "", "url": "https://api.github.com/users/lucasjinreal", "html_url": "https://github.com/lucasjinreal", "followers_url": "https://api.github.c...
[]
open
false
null
[]
[ "Hi @lucasjinreal, \r\n\r\nWithout a code sample to replicate, information about the running environment or more information about the error - including full trackback - there isn't much we can do to help you here." ]
1,708
1,708
null
NONE
null
Recently there are many errors coming from either the fastchat or llava code base when using the latest transformers. WARNING: tokenization mismatch: 43 vs. 44. (ignored) Why does this happen and how can it be dismissed? Will it affect the final training result?
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29126/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29126/timeline
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/29125
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29125/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29125/comments
https://api.github.com/repos/huggingface/transformers/issues/29125/events
https://github.com/huggingface/transformers/pull/29125
2,143,504,797
PR_kwDOCUB6oc5nWUBE
29,125
feat: Upgrade Weights & Biases callback
{ "login": "parambharat", "id": 12809212, "node_id": "MDQ6VXNlcjEyODA5MjEy", "avatar_url": "https://avatars.githubusercontent.com/u/12809212?v=4", "gravatar_id": "", "url": "https://api.github.com/users/parambharat", "html_url": "https://github.com/parambharat", "followers_url": "https://api.github.com/...
[]
open
false
null
[]
[]
1,708
1,708
null
CONTRIBUTOR
null
# What does this PR do? This PR adds a few new functionalities to the Weights & Biases Callback - Logs Peft and Lora Config to wandb if present - Adds model parameter counts to wandb config and artifact metadata - Adds on_predict methods to log prediction metrics - Prints the model architecture to a file alongsi...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29125/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29125/timeline
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/29125", "html_url": "https://github.com/huggingface/transformers/pull/29125", "diff_url": "https://github.com/huggingface/transformers/pull/29125.diff", "patch_url": "https://github.com/huggingface/transformers/pull/29125.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/29124
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29124/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29124/comments
https://api.github.com/repos/huggingface/transformers/issues/29124/events
https://github.com/huggingface/transformers/pull/29124
2,143,420,111
PR_kwDOCUB6oc5nWBoW
29,124
added unrolled whisper_generation.py
{ "login": "robertgshaw2-neuralmagic", "id": 114415538, "node_id": "U_kgDOBtHXsg", "avatar_url": "https://avatars.githubusercontent.com/u/114415538?v=4", "gravatar_id": "", "url": "https://api.github.com/users/robertgshaw2-neuralmagic", "html_url": "https://github.com/robertgshaw2-neuralmagic", "followe...
[]
closed
false
null
[]
[]
1,708
1,708
1,708
NONE
null
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29124/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29124/timeline
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/29124", "html_url": "https://github.com/huggingface/transformers/pull/29124", "diff_url": "https://github.com/huggingface/transformers/pull/29124.diff", "patch_url": "https://github.com/huggingface/transformers/pull/29124.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/29123
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29123/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29123/comments
https://api.github.com/repos/huggingface/transformers/issues/29123/events
https://github.com/huggingface/transformers/pull/29123
2,143,416,822
PR_kwDOCUB6oc5nWA8d
29,123
[`Core generation`] Let's be less restrictive on the arguments passed to the generation calls.
{ "login": "ArthurZucker", "id": 48595927, "node_id": "MDQ6VXNlcjQ4NTk1OTI3", "avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ArthurZucker", "html_url": "https://github.com/ArthurZucker", "followers_url": "https://api.github.c...
[]
open
false
null
[]
[ "The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29123). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update." ]
1,708
1,708
null
COLLABORATOR
null
# What does this PR do? Updates generate calls
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29123/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29123/timeline
null
true
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/29123", "html_url": "https://github.com/huggingface/transformers/pull/29123", "diff_url": "https://github.com/huggingface/transformers/pull/29123.diff", "patch_url": "https://github.com/huggingface/transformers/pull/29123.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/29122
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29122/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29122/comments
https://api.github.com/repos/huggingface/transformers/issues/29122/events
https://github.com/huggingface/transformers/pull/29122
2,143,413,555
PR_kwDOCUB6oc5nWARN
29,122
FIX [`bnb` / `tests`] Propagate the changes from #29092 to 4-bit tests
{ "login": "younesbelkada", "id": 49240599, "node_id": "MDQ6VXNlcjQ5MjQwNTk5", "avatar_url": "https://avatars.githubusercontent.com/u/49240599?v=4", "gravatar_id": "", "url": "https://api.github.com/users/younesbelkada", "html_url": "https://github.com/younesbelkada", "followers_url": "https://api.githu...
[]
closed
false
null
[]
[ "The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29122). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update." ]
1,708
1,708
1,708
CONTRIBUTOR
null
# What does this PR do? As per title, I overlooked the fix and forgot to push the changes of https://github.com/huggingface/transformers/pull/29092 in 4-bit tests 😢 cc @amyeroberts @Titus-von-Koeller
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29122/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29122/timeline
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/29122", "html_url": "https://github.com/huggingface/transformers/pull/29122", "diff_url": "https://github.com/huggingface/transformers/pull/29122.diff", "patch_url": "https://github.com/huggingface/transformers/pull/29122.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/29121
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29121/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29121/comments
https://api.github.com/repos/huggingface/transformers/issues/29121/events
https://github.com/huggingface/transformers/issues/29121
2,143,187,142
I_kwDOCUB6oc5_vnDG
29,121
AttributeError: 'DistilBertModel' object has no attribute '_use_flash_attention_2'
{ "login": "javilonso", "id": 31996659, "node_id": "MDQ6VXNlcjMxOTk2NjU5", "avatar_url": "https://avatars.githubusercontent.com/u/31996659?v=4", "gravatar_id": "", "url": "https://api.github.com/users/javilonso", "html_url": "https://github.com/javilonso", "followers_url": "https://api.github.com/users/...
[]
open
false
null
[]
[ "Hi @javilonso ! \r\nI quickly tried on transformers main: \r\n```python\r\nfrom transformers import pipeline\r\n\r\nunmasker = pipeline('fill-mask', model='distilbert-base-uncased')\r\nunmasker(\"Hello I'm a [MASK] model.\")\r\n```\r\nBut I did not managed to repro, can you share a snippet to reproduce the issue?\...
1,708
1,708
null
NONE
null
### System Info Obtaining this error in last transformers 4.37.2, but works correctly in transformers 4.35.2 Simple inference with a finetuned distilbert model. ### Who can help? _No response_ ### Information - [ ] The official example scripts - [ ] My own modified scripts ### Tasks - [ ] An officially supporte...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29121/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29121/timeline
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/29120
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29120/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29120/comments
https://api.github.com/repos/huggingface/transformers/issues/29120/events
https://github.com/huggingface/transformers/pull/29120
2,143,042,742
PR_kwDOCUB6oc5nUwcG
29,120
Starcoder2 model
{ "login": "jlamypoirier", "id": 18523627, "node_id": "MDQ6VXNlcjE4NTIzNjI3", "avatar_url": "https://avatars.githubusercontent.com/u/18523627?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jlamypoirier", "html_url": "https://github.com/jlamypoirier", "followers_url": "https://api.github.c...
[]
open
false
null
[]
[]
1,708
1,708
null
CONTRIBUTOR
null
The Starcoder2 model, adapted from Mistral. All changes are done through options, so Mistral itself is still supported. Main changes: * Use layer norm (RMS still available as option) * Use standard MLP (gated still available as option) * Add back biases (optional) * Change (default?) tokenizer class *Embedding and...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29120/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29120/timeline
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/29120", "html_url": "https://github.com/huggingface/transformers/pull/29120", "diff_url": "https://github.com/huggingface/transformers/pull/29120.diff", "patch_url": "https://github.com/huggingface/transformers/pull/29120.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/29119
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/29119/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/29119/comments
https://api.github.com/repos/huggingface/transformers/issues/29119/events
https://github.com/huggingface/transformers/pull/29119
2,143,005,049
PR_kwDOCUB6oc5nUoNF
29,119
Generate: unset GenerationConfig parameters do not raise warning
{ "login": "gante", "id": 12240844, "node_id": "MDQ6VXNlcjEyMjQwODQ0", "avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4", "gravatar_id": "", "url": "https://api.github.com/users/gante", "html_url": "https://github.com/gante", "followers_url": "https://api.github.com/users/gante/follow...
[]
closed
false
null
[]
[ "The docs for this PR live [here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_29119). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update." ]
1,708
1,708
1,708
MEMBER
null
# What does this PR do?: Thank you @fxmarty for raising [this issue](https://github.com/huggingface/transformers/pull/25381#issuecomment-1952527813). This PR allows users to unset (= set to `None`) unused parameters to ensure `generation_config.validate()` doesn't throw a warning. Previously, this was not possibl...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/29119/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/29119/timeline
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/29119", "html_url": "https://github.com/huggingface/transformers/pull/29119", "diff_url": "https://github.com/huggingface/transformers/pull/29119.diff", "patch_url": "https://github.com/huggingface/transformers/pull/29119.patch", "merged_at...