Notnaton/microllm

Microllm

Just the bare basics to run inference on local hardware.

Currently working:

  • gguf.py — reads the entire GGUF file and returns the file offsets for the tensor data.
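This is not the repo's actual parser — just a minimal sketch of the fixed-size header at the start of every GGUF file (magic, version, tensor count, metadata key/value count), which is the first thing a reader like `gguf.py` has to decode before it can walk the metadata and locate the tensor data:

```python
import struct

GGUF_MAGIC = b"GGUF"

def read_gguf_header(data: bytes):
    """Parse the fixed GGUF header: magic, uint32 version,
    uint64 tensor count, uint64 metadata KV count (all little-endian)."""
    if data[:4] != GGUF_MAGIC:
        raise ValueError("not a GGUF file")
    version, tensor_count, kv_count = struct.unpack_from("<IQQ", data, 4)
    return version, tensor_count, kv_count

# Demo with a synthetic header: version 3, 2 tensors, 5 metadata keys
header = GGUF_MAGIC + struct.pack("<IQQ", 3, 2, 5)
print(read_gguf_header(header))  # (3, 2, 5)
```

The metadata key/value pairs and tensor info blocks that follow this header are variable-length, so a real parser reads them sequentially to compute each tensor's absolute file offset.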

Todo:

  • load tensors into the model
  • inference
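Since the parser already returns file offsets, one possible approach to the "load tensors" step is to memory-map each tensor's data at its known offset. This is only a hedged sketch for unquantized (e.g. float32) tensors — GGUF models usually store quantized blocks, which would need dequantization first — and `load_tensor` is a hypothetical helper, not part of the repo:

```python
import numpy as np

def load_tensor(path, offset, shape, dtype=np.float32):
    """Memory-map one tensor's raw data at a known file offset,
    without copying the whole model file into RAM."""
    return np.memmap(path, dtype=dtype, mode="r", offset=offset, shape=shape)

# Demo: write a fake file with 16 bytes of "header" followed by tensor data
import tempfile
arr = np.arange(6, dtype=np.float32).reshape(2, 3)
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"\x00" * 16)      # stand-in for header/metadata bytes
    f.write(arr.tobytes())     # tensor data starts at offset 16
    path = f.name

loaded = load_tensor(path, offset=16, shape=(2, 3))
print(loaded)
```

Memory-mapping keeps startup cheap and lets the OS page tensor data in on demand, which matters for multi-gigabyte model files.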

About

My own implementation to run inference on local LLM models
