🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
NAACL '24 (Demo) / MLSys @ NeurIPS '23 - RedCoast: A Lightweight Tool to Automate Distributed Training and Inference
The corresponding code for our paper: A sequence-to-sequence approach for document-level relation extraction.
[Data Analysis Capstone Design] A project that suggests replies matching the user's speech style on smartwatches.
A large collection of Khmer language resources. Khmer is the official language of Cambodia.
Gateway into the John Snow Labs Ecosystem
An elegant PyTorch implementation of transformers.
A study on Knowledge-based question generation from images. Undergraduate Thesis for 2023-2024.
A sequence-to-sequence framework built on PyTorch, with a focus on Neural Machine Translation.
Code to address Natural Language Generation Tasks via Transformer Architecture
One line of code for thousands of state-of-the-art NLP models in hundreds of languages. The fastest and most accurate way to solve text problems.
A desktop application to assist in learning languages. Uses a deep learning model to generate translations.
Lingvo
Developing polite-style and friendly-style converters that are robust to conversational text, by applying reverse-engineering techniques to a KakaoTalk speech-style converter.
Repository for various Coursera courses on NLP.
This repository explores the use of advanced sequence-to-sequence networks and transformer models, such as BERT, BART, PEGASUS, and T5, for summarizing multi-text documents in the medical domain. It leverages extensive datasets like CORD-19 and a Biomedical Abstracts dataset from Hugging Face to fine-tune these models.
Seq2SeqSharp is a tensor-based, fast and flexible deep neural network framework written in .NET (C#). Highlights include automatic differentiation, multiple network types (Transformer, LSTM, BiLSTM, and more), multi-GPU support, cross-platform operation (Windows, Linux; x86, x64, ARM), and multimodal models for text and images.
A Python package with ready-to-use models for various NLP tasks and text preprocessing utilities. The implementation allows fine-tuning.
Transformers4Rec is a flexible and efficient library for sequential and session-based recommendation and works with PyTorch.
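Several of the frameworks listed above share the same encoder-decoder (seq2seq) pattern. A minimal PyTorch sketch of that pattern follows; all class names, layer sizes, and shapes are illustrative and not taken from any specific repository:

```python
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    """Toy encoder-decoder: illustrative only, not from any listed repo."""
    def __init__(self, vocab_size, emb_dim=32, hid_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, src, tgt):
        # Encode the source sequence into a final hidden state...
        _, hidden = self.encoder(self.embed(src))
        # ...and condition the decoder on it (teacher forcing on tgt).
        dec_out, _ = self.decoder(self.embed(tgt), hidden)
        return self.out(dec_out)  # (batch, tgt_len, vocab_size)

model = Seq2Seq(vocab_size=100)
src = torch.randint(0, 100, (2, 7))  # batch of 2 source sequences, length 7
tgt = torch.randint(0, 100, (2, 5))  # batch of 2 target prefixes, length 5
logits = model(src, tgt)
print(logits.shape)  # torch.Size([2, 5, 100])
```

Production frameworks above add attention, beam-search decoding, and distributed training on top of this basic structure.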