Instructions for using macabdul9/mrpc with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use macabdul9/mrpc with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="macabdul9/mrpc")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("macabdul9/mrpc")
model = AutoModelForSequenceClassification.from_pretrained("macabdul9/mrpc")
```
- Notebooks
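MRPC (Microsoft Research Paraphrase Corpus) is a sentence-pair paraphrase task, so the model expects two sentences per input. The sketch below assumes macabdul9/mrpc follows the standard fine-tuned MRPC setup; the example sentences and the label names in the model's config are illustrative assumptions:

```python
# Minimal inference sketch (assumes macabdul9/mrpc is a fine-tuned
# MRPC sentence-pair classifier; example sentences and the label
# mapping in model.config.id2label are illustrative assumptions).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("macabdul9/mrpc")
model = AutoModelForSequenceClassification.from_pretrained("macabdul9/mrpc")

s1 = "The company said its profits rose in the quarter."
s2 = "The firm reported higher quarterly profits."

# MRPC models take sentence pairs, so pass both sentences to the tokenizer.
inputs = tokenizer(s1, s2, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Convert logits to probabilities and read off the predicted label.
probs = torch.softmax(logits, dim=-1)[0]
pred = int(probs.argmax())
print(model.config.id2label[pred], float(probs[pred]))
```

The pipeline accepts the same sentence pair as `pipe({"text": s1, "text_pair": s2})`.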
- Google Colab
- Kaggle