
Fine-tuning open-source AI models for fun and profit in 2025

Tiny open-source AI models? Self-hosted services? What could go wrong? 😈 A pledge for 2025.

November 21, 2024 - 3 min. read

Image generated using Flux

Intro.

In 2025 I'm launching a couple of new experiments (plus a payments project) 🤖 ₿ 🚀! Let this be the pledge that holds me accountable.


The first experiment is called Futurewise: an automated AI training platform with curated AI tools, news & courses for the LATAM market. Minimal, simple, to the point. A sort of BensBites for low-GDP countries.

Of course that could be fairly described as just another shitty newsletter, but in this case the experiment (as all experiments should) has a goal.

The goal is to determine whether I can automate the whole process of creating & maintaining such a project: from content, to images, to code, to operations, using AI & Playwright.
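To make "automate the whole process" a bit more concrete, here is a minimal sketch of the content-gathering step using Playwright's Python API. The URL and the CSS selector are placeholders, not the actual pipeline:

```python
# Minimal sketch of a content-gathering step with Playwright's sync API.
# The URL and selector below are placeholders; each real source would
# need its own scraping logic.
from playwright.sync_api import sync_playwright


def collect_headlines(url: str) -> list[str]:
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        page.goto(url, wait_until="networkidle")
        # Hypothetical selector: grab headline text from article cards.
        headlines = page.locator("article h2").all_text_contents()
        browser.close()
    return headlines


if __name__ == "__main__":
    for headline in collect_headlines("https://example.com/ai-news"):
        print(headline)
```

From there, the raw headlines would be fed to a model for summarization, curation and image prompts.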

The idea is to see to what extent I can handle this challenge, and to learn more about (mostly open-source) AI models along the way.

Also, trying to automate this process will lead me to understanding, using and maybe building AI agents in 2025, apart from leveling up my machine learning, HuggingFace & Python skills.

On the other hand, as CTO of Braaay.com I am learning about wine and drinking wine (and craft beers), quite a lot of them. As a result of this commitment to the project, I have become acquainted with the subject and with the challenges of running a wine e-commerce & showroom in São Paulo & Uruguay.

So in 2025 I will be moving the site from WooCommerce to a lean and clean Svelte + Tailwind front-end (with a headless WooCommerce in the back, to avoid scaring the crap out of editors who are by now familiar with the admin interface). Additionally, I will leverage that Svelte 5 project to build a Tauri mobile app for the store's Fidelity & Cashback Club.
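For the curious, "headless WooCommerce" just means the storefront consumes the WooCommerce REST API instead of rendering PHP templates. A minimal sketch of that call (shown in Python for brevity; the store URL and API keys are placeholders):

```python
# Sketch of what "headless WooCommerce" means in practice: the storefront
# (Svelte, or anything else) simply consumes the WooCommerce REST API.
# The store URL and consumer key/secret below are placeholders.
import requests

STORE = "https://example-store.com"
AUTH = ("ck_xxxxxxxx", "cs_xxxxxxxx")  # WooCommerce consumer key / secret


def list_products(per_page: int = 10) -> list[dict]:
    resp = requests.get(
        f"{STORE}/wp-json/wc/v3/products",
        params={"per_page": per_page, "status": "publish"},
        auth=AUTH,
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()


for product in list_products():
    print(product["name"], product["price"])
```

The Svelte front-end and the Tauri app would hit the same endpoints, just from TypeScript.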

I would prefer Ghost as a CMS, but life, unfair as it sometimes is, is about tradeoffs. There's mine. I will have to learn to live with it!

And... what does all that have to do with LLM fine-tuning?

Well, at Braaay we need to elevate the game. The wine market is crowded but growing.

My way of elevating the game is to contribute technology, plus a lot of deep transcendental breathing and the occasional bike ride.

So in 2025 I will have to learn how to fine-tune a model (probably a version of Llama, via replicate.com), train it on wine data (already collected from the site), create an Ollama version of it and host it locally, behind Cloudflare tunnels and a Dockerized reverse proxy.
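As a rough sketch of what kicking off that fine-tune could look like with the Replicate Python client, assuming a trainable Llama base and a JSONL of wine Q&A pairs. The version id, input schema and names below are placeholders; the actual trainable model page on replicate.com is the source of truth:

```python
# Rough sketch of starting a fine-tune on Replicate (requires the
# REPLICATE_API_TOKEN environment variable). Version id, input keys and
# destination below are placeholders, not a working configuration.
import replicate

training = replicate.trainings.create(
    # Placeholder: the id of a trainable Llama version goes here.
    version="meta/llama-2-7b:xxxxxxxxxxxxxxxx",
    input={
        # Placeholder: publicly reachable JSONL of wine Q&A pairs
        # curated from the Braaay catalogue.
        "train_data": "https://example.com/braaay-wine-qa.jsonl",
        "num_train_epochs": 3,
    },
    destination="my-username/braaay-sommelier",
)
print(training.status)  # queued / processing / succeeded
```

The resulting weights would then be converted for Ollama (via a Modelfile) and served locally behind the tunnel and reverse proxy.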

The result will be our own in-house AI wine sommelier. Like buscawine.com on steroids.
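Once the fine-tuned weights are imported into Ollama, querying the sommelier is a single HTTP call to Ollama's REST API. A minimal sketch follows; the model name is hypothetical, and behind the Cloudflare tunnel the host would be a public hostname instead of localhost:

```python
# Sketch of querying the locally hosted sommelier through Ollama's REST API.
# "braaay-sommelier" is a hypothetical local model name; behind the
# Cloudflare tunnel the host would be a public hostname instead.
import requests

OLLAMA_HOST = "http://localhost:11434"


def ask_sommelier(question: str) -> str:
    resp = requests.post(
        f"{OLLAMA_HOST}/api/generate",
        json={
            "model": "braaay-sommelier",
            "prompt": question,
            "stream": False,
        },
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]


print(ask_sommelier("Suggest a Uruguayan Tannat to pair with a barbecue."))
```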

I will use that model to build several things around the store; I'll detail them as the experiment progresses.

I will be posting all stages of both challenges, Futurewise & Braaay, in the experiment changelog below. Please (don't) stay tuned, I might just forget to update it.

Code for the experiments is already open-sourced on GitHub here, so feel free to say hi.
As for the payments project, it's called ViiVPay; I will post an update (and maybe the project itself) sometime in Q2 2025.

Bike around and be safe!


Changelog.

ai docker ollama experiment
