Instructions to use sofom/roberta-base-hyperaram with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use sofom/roberta-base-hyperaram with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("feature-extraction", model="sofom/roberta-base-hyperaram")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("sofom/roberta-base-hyperaram")
model = AutoModel.from_pretrained("sofom/roberta-base-hyperaram")
```

- Notebooks
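The feature-extraction pipeline and `AutoModel` both return token-level hidden states, so a single fixed-size sentence embedding usually requires pooling. A minimal sketch of masked mean pooling, using a dummy tensor in place of `outputs.last_hidden_state` (the shapes are assumptions for illustration):

```python
import torch

# Simulated encoder output: (batch=1, seq_len=4, hidden=768),
# standing in for outputs.last_hidden_state from the model above.
last_hidden_state = torch.randn(1, 4, 768)
attention_mask = torch.tensor([[1, 1, 1, 0]])  # last position is padding

# Masked mean pooling: average only over real (non-padding) tokens.
mask = attention_mask.unsqueeze(-1).float()      # (1, 4, 1)
summed = (last_hidden_state * mask).sum(dim=1)   # (1, 768)
counts = mask.sum(dim=1)                         # (1, 1)
sentence_embedding = summed / counts             # (1, 768)
print(sentence_embedding.shape)
```

Masking before averaging keeps padding tokens from diluting the embedding; the same pooling applies unchanged to batched inputs.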
- Google Colab
- Kaggle
- Xet hash: 729ff87933d8b390765c697207f2e9ad3b4c79ed07fa52b41cb5123e9e96ab29
- Size of remote file: 249 MB
- SHA256: e75074490f6c5bbc8b25a736d4ae0eaf4113f875cd739b3b66a103d100a89975
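The SHA256 above can be used to verify a downloaded copy of the file. A minimal sketch using Python's standard `hashlib`; the local filename is a hypothetical placeholder:

```python
import hashlib

def sha256_of_file(path, chunk_size=1 << 20):
    """Stream a file in chunks and return its hex SHA-256 digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the SHA256 listed above ("model.safetensors" is assumed).
expected = "e75074490f6c5bbc8b25a736d4ae0eaf4113f875cd739b3b66a103d100a89975"
# assert sha256_of_file("model.safetensors") == expected
```

Streaming in chunks keeps memory use constant, which matters for a ~249 MB file.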
Xet efficiently stores large files inside Git, splitting files into unique chunks to deduplicate storage and accelerate uploads and downloads.