• 7 Posts
  • 124 Comments
Joined 1 year ago
Cake day: July 29th, 2023










  • This is correct. I had already installed the minio CLI, but when I came back and read this I tried it out, and yes, once garage is running in the container you can

    alias garage="docker exec -ti <container name> /garage"
    

    so you can do the CLI things like garage bucket info test-bucket or whatever, as in the sketch below. The --help output for the garage command is pretty great, which is good since the docs don’t cover it in much detail.
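    For example, with that alias in place you can drive the cluster from the host shell. This is only a rough sketch: the bucket name is made up, and subcommands and flags vary a bit between garage versions, so check --help for yours.

    # poke at the cluster and buckets through the aliased CLI
    garage status                      # show cluster nodes and layout
    garage bucket list                 # list all buckets
    garage bucket info test-bucket     # details for one bucket (hypothetical name)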







  • starcoder2:latest       	f67ae0f64584	1.7 GB	3 days ago 	
    phi3:latest             	d184c916657e	2.2 GB	3 weeks ago	
    deepseek-coder-v2:latest	8577f96d693e	8.9 GB	3 weeks ago	
    llama3:8b-instruct-q8_0 	1b8e49cece7f	8.5 GB	3 weeks ago	
    dolphin-mistral:latest  	5dc8c5a2be65	4.1 GB	3 weeks ago	
    codeqwen:latest         	df352abf55b1	4.2 GB	3 weeks ago	
    llama3:latest           	365c0bd3c000	4.7 GB	4 weeks ago
    

    I mostly use starcoder2 with Continue for code autocomplete. The big deepseek coder is a bit slow (I can feel it thinking), but it and the regular llama3 are good for chatbot-type programming questions.

    I don’t really have anything to compare the M1 performance to. I guess the 8GB models output text a little slower than the web versions of the same models, and the 4GB ones about the same. Using ollama in the terminal, there’s sometimes a 0.5-2 second pause before it starts outputting. Not with phi3 though - it’s surprisingly snappy for the quality of answers.
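    For anyone wanting to try the same setup, the ollama side is roughly this (a sketch - the model tags are the ones from my list above, the prompt is made up, and the port is just ollama’s default):

    # pull a couple of the models from the list above
    ollama pull starcoder2:latest
    ollama pull phi3:latest

    # quick chatbot-style test in the terminal
    ollama run phi3 "write a shell one-liner to find the largest files in a directory"

    # Continue (and anything else) talks to the local ollama server over HTTP
    curl http://localhost:11434/api/tags    # should list the installed models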







  • Yes, a few. Signal (daily use), LetsEncrypt & Certbot (EFF). It’s not enough.

    One day I decided I’d spend $x every January (when I do all my other donations) on open source stuff I depend on, roughly in the proportions I depend on it. It quickly became impossible: I can’t just fund Debian (which I use a lot in VMs); I’d need to think of all its dependencies too, and the same goes for NGINX, Node, etc. The mind boggles.

    I need something like a Spotify subscription for open source, to assuage my guilt over the great value I extract from open source for my personal use.