How to use Capreolus/electra-base-msmarco with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="Capreolus/electra-base-msmarco")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("Capreolus/electra-base-msmarco")
model = AutoModelForSequenceClassification.from_pretrained("Capreolus/electra-base-msmarco")
```
# capreolus/electra-base-msmarco

## Model description
An ELECTRA-Base model (google/electra-base-discriminator) fine-tuned on the MS MARCO passage classification task. It is intended to be used as a ForSequenceClassification model, but it requires some modification, because the checkpoint contains a BERT-style classification head rather than the standard ELECTRA classification head. See the TFElectraRelevanceHead in the Capreolus BERT-MaxP implementation for a usage example.
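To illustrate the head mismatch mentioned above: a BERT-style classification head applies dropout and a single linear layer to a pooled [CLS] representation, whereas ELECTRA's standard sequence-classification head inserts an extra dense layer with an activation before the output projection. The sketch below is a minimal, hypothetical PyTorch rendering of a BERT-style relevance head; the class and parameter names are illustrative and are not taken from the Capreolus implementation.

```python
import torch
import torch.nn as nn


class BertStyleRelevanceHead(nn.Module):
    """Illustrative BERT-style head: dropout + one linear layer over [CLS].

    ELECTRA's usual head would add a dense+activation layer before the
    output projection; this checkpoint instead ships a head of this shape.
    """

    def __init__(self, hidden_size: int = 768, num_labels: int = 2, dropout: float = 0.1):
        super().__init__()
        self.dropout = nn.Dropout(dropout)
        self.classifier = nn.Linear(hidden_size, num_labels)

    def forward(self, cls_hidden: torch.Tensor) -> torch.Tensor:
        # cls_hidden: (batch, hidden_size) — the [CLS] token representation
        return self.classifier(self.dropout(cls_hidden))


# Shape check with random activations standing in for encoder output
head = BertStyleRelevanceHead()
logits = head(torch.randn(4, 768))
print(logits.shape)  # one (relevant, non-relevant) logit pair per input
```

In practice this means the fine-tuned head weights must be mapped onto a module of this shape when loading the checkpoint, rather than onto ELECTRA's default classifier.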
This corresponds to the ELECTRA-Base model used to initialize PARADE (ELECTRA) in PARADE: Passage Representation Aggregation for Document Reranking by Li et al. It was converted from the released TFv1 checkpoint. Please cite the PARADE paper if you use these weights.