🎓Automatically update CV papers daily using GitHub Actions (updated every 24 hours)
Updated Jun 3, 2024 - Python
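The daily-update mechanism named above can be sketched as a GitHub Actions workflow with a cron schedule. This is a minimal illustration only: the script name `update_papers.py` and the commit step are assumptions, not taken from the actual repository.

```yaml
name: daily-update
on:
  schedule:
    - cron: "0 0 * * *"   # run once every 24 hours, at 00:00 UTC
  workflow_dispatch:       # also allow manual runs
jobs:
  update:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      # Hypothetical entry point; the real repo's script may differ.
      - run: python update_papers.py
      - run: |
          git config user.name "github-actions[bot]"
          git config user.email "github-actions[bot]@users.noreply.github.com"
          git add -A
          git commit -m "Daily update" || echo "No changes to commit"
          git push
```

The `schedule` trigger is how GitHub Actions runs jobs on a timer; the trailing `|| echo` keeps the job green on days when no new papers appear.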
Personal project: MPP-Qwen14B (Multimodal Pipeline Parallel Qwen14B). Don't let poverty limit your imagination! Train your own 14B LLaVA-like MLLM on an RTX 3090/4090 with 24 GB.
Saprot: Protein Language Model with Structural Alphabet
Customized Pretraining for NLG Tasks
Code repository for the conference paper "Organoid Segmentation Using Self-Supervised Learning: How Complex Should the Pretext Task Be?" published and presented at the International Conference on Biomedical and Bioinformatics Engineering (ICBBE) 2023.
Example code demonstrating how to fine-tune multimodal large language models with LLaMA-Factory
PaddlePaddle's large-model development suite, providing full-pipeline development toolchains for large language models, cross-modal large models, biocomputing large models, and more.
This collection of notebooks is based on the Dive into Deep Learning book. All notes are written in PyTorch using the d2l/torch library.
Official Repository for the Uni-Mol Series Methods
Using pre-training and interaction modeling for ancestry-specific disease prediction with multi-omics data from the UK Biobank
Llama Chinese community. The Llama 3 online demo and fine-tuned models are now available, and the latest Llama 3 learning resources are aggregated in real time. All code has been updated for Llama 3. Building the best Chinese Llama LLM, fully open source and commercially usable.
[NeurIPS2022] Egocentric Video-Language Pretraining
[ICCV2023] UniVTG: Towards Unified Video-Language Temporal Grounding
Official implementation of Matrix Variational Masked Autoencoder (M-MAE) for ICML paper "Information Flow in Self-Supervised Learning" (https://arxiv.org/abs/2309.17281)
Official implementation of ICML 2024 paper "Matrix Information Theory for Self-supervised Learning" (https://arxiv.org/abs/2305.17326)
Taught by AI pioneer Andrew Ng, this course covers cutting-edge topics such as how generative AI works (including what it can and can't do), common use cases such as reading, writing, and chatting, the life cycle of GenAI projects, advanced technology options such as RAG, fine-tuning, and pre-training, and the implications of GenAI for business and society.
PonderV2: Pave the Way for 3D Foundation Model with A Universal Pre-training Paradigm
Very incomplete right now: a pretrained ARGVAET system for generating, classifying, and predicting the properties of molecules. The dataset and checkpoints could not be uploaded due to size constraints.
Official repository of OFA (ICML 2022). Paper: OFA: Unifying Architectures, Tasks, and Modalities Through a Simple Sequence-to-Sequence Learning Framework