deit_small_patch16_224

Converted TIMM image classification model for LiteRT.

  • Source architecture: deit_small_patch16_224
  • Source checkpoint: timm/deit_small_patch16_224.fb_in1k
  • File: model.tflite
  • Input: float32 tensor in NCHW layout, shape [1, 3, 224, 224]
  • Output: ImageNet-1K logits, shape [1, 1000]
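A minimal pre/post-processing sketch for the tensors above. The normalization constants are an assumption (timm's DeiT `fb_in1k` checkpoints typically use the standard ImageNet mean/std); the function names are illustrative, not part of any published API:

```python
import numpy as np

# Assumed ImageNet normalization stats (timm defaults for DeiT fb_in1k).
MEAN = np.array([0.485, 0.456, 0.406], dtype=np.float32)
STD = np.array([0.229, 0.224, 0.225], dtype=np.float32)

def preprocess(image_hwc_uint8: np.ndarray) -> np.ndarray:
    """HWC uint8 RGB [224, 224, 3] -> NCHW float32 [1, 3, 224, 224]."""
    x = image_hwc_uint8.astype(np.float32) / 255.0
    x = (x - MEAN) / STD                  # per-channel normalization
    x = np.transpose(x, (2, 0, 1))        # HWC -> CHW
    return x[np.newaxis, ...]             # add batch dim -> NCHW

def top1(logits: np.ndarray) -> tuple[int, float]:
    """ImageNet-1K logits [1, 1000] -> (class index, softmax probability)."""
    z = logits[0] - logits[0].max()       # subtract max for numerical stability
    p = np.exp(z) / np.exp(z).sum()
    idx = int(p.argmax())
    return idx, float(p[idx])

# Shape check with a dummy image:
img = np.zeros((224, 224, 3), dtype=np.uint8)
batch = preprocess(img)
print(batch.shape, batch.dtype)  # (1, 3, 224, 224) float32
```

The resulting `batch` is what you feed to the model; `top1` turns the `[1, 1000]` logits back into a class prediction.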

Runtime Status

  • CPU smoke test: passed with LiteRT CompiledModel.
  • GPU delegation: currently blocked for this model. The GPU backend does not yet support the rank-5 tensor patterns produced by its RESHAPE, TRANSPOSE, and related window/attention operations. The model is published as CPU-ready while GPU support is being improved.

Model Details

Citation

@InProceedings{pmlr-v139-touvron21a,
  title =     {Training data-efficient image transformers \& distillation through attention},
  author =    {Touvron, Hugo and Cord, Matthieu and Douze, Matthijs and Massa, Francisco and Sablayrolles, Alexandre and Jegou, Herve},
  booktitle = {International Conference on Machine Learning},
  pages =     {10347--10357},
  year =      {2021},
  volume =    {139},
  month =     {July}
}
@misc{rw2019timm,
  author = {Ross Wightman},
  title = {PyTorch Image Models},
  year = {2019},
  publisher = {GitHub},
  journal = {GitHub repository},
  doi = {10.5281/zenodo.4414861},
  howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}