Several machine learning classifiers in Python (updated Sep 26, 2020)
Using CCR to predict piezoresponse force microscopy datasets
Anomaly Detection by Recombining Gated Unsupervised Experts
Faster alternative to Fast Feedforward Layer that uses angular distance for routing
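Routing by angular distance, as described above, can be sketched with a tiny helper: each expert owns a learned key vector, and the input is sent to the expert whose key is closest in angle (equivalently, highest cosine similarity after normalization). The function name `angular_route` and the two-expert keys are illustrative assumptions, not the repository's actual API.

```python
import numpy as np

def angular_route(x, expert_keys):
    # Normalize the input and each expert's key to unit length so that
    # angular distance is monotone in (1 - cosine similarity); the nearest
    # expert in angle therefore maximizes the dot product.
    x = x / np.linalg.norm(x)
    keys = expert_keys / np.linalg.norm(expert_keys, axis=1, keepdims=True)
    return int(np.argmax(keys @ x))

# Two hypothetical expert keys along the coordinate axes.
keys = np.array([[1.0, 0.0], [0.0, 1.0]])
angular_route(np.array([0.9, 0.1]), keys)  # routes to expert 0
```

The avoided softmax over all experts is what makes this style of routing cheap: only one argmax over dot products is needed per token.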
Code repository for: Nguyen, H., Nguyen, T., Nguyen, K., & Ho, N. (2024). Towards Convergence Rates for Parameter Estimation in Gaussian-gated Mixture of Experts. In Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, AISTATS 2024 (acceptance rate 27.6% over 1,980 submissions).
Differentially private retriever using transformer memory as a search index for information retrieval
An LLM toolkit
Review of Google's multitask ranking system, comparing it to other methods used in recommender systems
Bayesian Learning for Control in Multimodal Dynamical Systems | written in Org-mode
These instructions aim to reproduce the results in the paper "Mesh-clustered Gaussian process emulator for partial differential equation boundary value problems" (2024), to appear in Technometrics.
Gaussian Process-Gated Hierarchical Mixture of Experts
This is a prototype of a Mixture-of-Experts LLM built with PyTorch. Currently in development; I am testing its learning capabilities on small toy tasks before training it on large language datasets.
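The core layer such a Mixture-of-Experts prototype revolves around can be sketched in a few lines of NumPy: a gate scores the experts, the top-k are selected, and their outputs are combined with softmax weights over the selected logits. The names `moe_forward`, `gate_w`, and `expert_ws`, and the choice of plain linear experts, are illustrative assumptions rather than this repository's actual design.

```python
import numpy as np

def moe_forward(x, gate_w, expert_ws, top_k=2):
    # Gate scores for every expert, then keep only the top-k.
    logits = gate_w @ x
    top = np.argsort(logits)[-top_k:]
    # Softmax restricted to the selected experts' logits.
    weights = np.exp(logits[top] - logits[top].max())
    weights /= weights.sum()
    # Weighted sum of the selected experts' outputs;
    # each expert here is just a linear map for illustration.
    return sum(w * (expert_ws[i] @ x) for w, i in zip(weights, top))

rng = np.random.default_rng(0)
x = rng.normal(size=4)
gate_w = rng.normal(size=(4, 4))        # 4 experts, one gate row each
expert_ws = rng.normal(size=(4, 4, 4))  # each expert: a 4x4 linear map
y = moe_forward(x, gate_w, expert_ws)
```

Because only `top_k` experts run per input, compute stays roughly constant as more experts are added, which is the usual motivation for MoE layers in large models.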
Code, data, and pre-trained models for our EMNLP 2021 paper "Think about it! Improving defeasible reasoning by first modeling the question scenario"
This R package allows the emulation using a mesh-clustered Gaussian process (mcGP) model for partial differential equation (PDE) systems.
This is the repo for the MixKABRN Neural Network (Mixture of Kolmogorov-Arnold Bit Retentive Networks): an attempt at first adapting it for training on text, and later adjusting it for other modalities.
MoE Decoder Transformer implementation with MLX
The idea of creating the best LLM currently possible came to me while watching a YouTube video on GaLore, the "sequel" to LoRA, and realizing how genuinely groundbreaking that technique is. I was daydreaming about pretraining my own model; this (probably impossible to implement) concept is a refined version of that model.
This collaborative framework is designed to harness the power of a Mixture of Experts (MoE) to automate a wide range of software engineering tasks, thereby enhancing code quality and expediting development processes.
Anomaly detection using ARGUE - an advanced mixture-of-experts autoencoder model