Notebooks for fine-tuning and evaluating pre-trained BERT models on semantic search tasks in Web API documentation
Updated Jul 24, 2024 - Jupyter Notebook
Study of the pathogenicity of human variants using deep learning. Fine-tuning BERT for disease representation: a learning method over biomedical text datasets.
Successfully built a Seq2Seq-with-attention model that performs English-to-Spanish translation with almost 97% accuracy.
Health Progress Prediction and Goal Attainment Analysis in Patient Discharge Summaries with BERT and TensorFlow (Feb 2022 - Jun 2023)
A smart question-answering system for both short and long documents. It automatically finds answers to matching questions directly in the documents: a deep learning language model converts questions and documents into semantic vectors and retrieves the best-matching answer.
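As a rough illustration of this kind of semantic matching (a generic sketch, not the code of the repository above), questions and documents can be compared by the cosine similarity of their embeddings. The vectors below are toy placeholders standing in for real BERT-style embeddings:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_match(question_vec: np.ndarray, doc_vecs: list) -> int:
    """Index of the document whose embedding is closest to the question."""
    scores = [cosine_similarity(question_vec, d) for d in doc_vecs]
    return int(np.argmax(scores))

# Toy 3-d "embeddings"; in practice these come from a BERT-style encoder.
question = np.array([1.0, 0.2, 0.0])
documents = [np.array([0.0, 1.0, 0.5]),   # off-topic document
             np.array([0.9, 0.3, 0.1])]   # close to the question
print(best_match(question, documents))  # → 1
```

With real sentence embeddings the retrieval logic is identical; only the encoder that produces the vectors changes.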
Notebooks for fine-tuning and evaluating a pre-trained BERT model to the task of semantic parameter matching in Web APIs
Using BERT models to perform sentiment analysis on women's clothing
External Knowledge Infusion using INCIDecoder into BERT for Chemical Mapping
Turkish Hate Speech Detection
A PyTorch Lightning Implementation of Multi-Language Identification using a SentenceTransformer model pre-trained on English. Work done while interning at ByteFuse.
Successfully developed a news category classification model using fine-tuned BERT which can accurately classify any news text into its respective category (Politics, Business, Technology, or Entertainment).
Successfully developed a resume classification model which can classify any person's resume into its corresponding job category with more than 99% accuracy.
Successfully developed a fine-tuned BERT transformer model which can effectively perform emotion classification on any given piece of text, identifying a suitable human emotion based on the semantic meaning of the text.
WSD for Word-in-Context (WiC) disambiguation, experimenting with BERT feature-based and fine-tuning approaches (GlossBERT)
Question Answering with a Fine-Tuned BERT
❓Fine-tuning BERT for extractive QA on SQuAD 2.0
Initially implements a document-retrieval system with SBERT embeddings and evaluates it on the CORD-19 dataset; afterwards, fine-tunes a BERT model on the SQuAD v2 dataset to evaluate it on the question-answering task.
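Several of the entries above fine-tune BERT for extractive QA on SQuAD, where the model emits a start logit and an end logit per token and the answer is the span maximizing their sum. The decoding step is largely model-independent; a minimal sketch of it, using made-up logits in place of real model output:

```python
import numpy as np

def best_span(start_logits, end_logits, max_len: int = 30):
    """Pick the (start, end) token span maximizing start_logit + end_logit,
    subject to start <= end and a maximum span length."""
    best, best_score = (0, 0), -np.inf
    for s, s_logit in enumerate(start_logits):
        for e in range(s, min(s + max_len, len(end_logits))):
            score = s_logit + end_logits[e]
            if score > best_score:
                best_score, best = score, (s, e)
    return best

# Made-up logits for a 6-token context; a real model emits one pair per token.
start = np.array([0.1, 2.0, 0.3, 0.1, 0.0, 0.2])
end   = np.array([0.0, 0.1, 0.2, 1.8, 0.1, 0.3])
print(best_span(start, end))  # → (1, 3)
```

Note that SQuAD 2.0 additionally contains unanswerable questions, which production decoders usually handle by comparing the best span score against a no-answer score; that refinement is omitted here.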