Arghh haha, I have been messing around with trying to understand why the transactions I see in Tiller from my credit card are fewer than what I see when I pull the CSV of transactions straight from my credit card website. Dohh. During my data wrangling, I had to manually copy and paste transactions out of my Google Sheets, because Google Sheets oddly enough cannot “export as CSV” just the rows that pass a data filter....
collect logseq logbook stats
Figured since I’m not good at Logseq Datalog queries yet, I may as well just read Logseq LOGBOOK data using plain Python. With an assist from ChatGPT, I have a nice proof of concept (a cleaned-up, runnable sketch follows below): from pathlib import Path; import polars as pl; import time_log as tl; journals_dir = "mgraphblah/journals"; pattern = "2025_*.md"; out_vec = tl.iterate_across_journals(journals_dir, pattern); df = pl.from_dicts(out_vec). How much time did I spend on taxes? per_tag_stats = df.explode("Tags").group_by("Tags").agg(pl.col("Duration (mins)").sum().alias("Total minutes")); per_tag_stats....
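A minimal runnable sketch of that per-tag rollup, assuming a polars DataFrame with "Tags" and "Duration (mins)" columns. The sample rows are made-up stand-ins for what time_log.iterate_across_journals would return; that helper is my own module and is not shown here.

```python
import polars as pl

# Hypothetical stand-in for out_vec = tl.iterate_across_journals(journals_dir, pattern):
# one dict per LOGBOOK entry, with its tags and logged duration.
out_vec = [
    {"Tags": ["taxes"], "Duration (mins)": 45},
    {"Tags": ["taxes", "admin"], "Duration (mins)": 30},
    {"Tags": ["reading"], "Duration (mins)": 60},
]

df = pl.from_dicts(out_vec)

# Explode the tag list so each tag gets its own row, then sum durations per tag.
per_tag_stats = (
    df.explode("Tags")
      .group_by("Tags")
      .agg(pl.col("Duration (mins)").sum().alias("Total minutes"))
)
print(per_tag_stats)
```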
good vibes all around
I was reading Andrej Karpathy’s original post, https://x.com/karpathy/status/1886192184808149383, on vibe coding. And yesterday, for the first time, I tried VS Code in order to try Codeium, my first touch of edit assist, which the early GitHub Copilot I test drove before did not have. Cursor introduced the edit concept, but I hadn’t tried it because I was already paying for OpenAI’s ChatGPT and didn’t want to double dip. But I got a chance to try Codeium at work, so I did....
batch a few book summaries
Drafting this here so far, before it leaves my brain. In Feel Good Productivity, Ali Abdaal captures very well the idea that what we do with the intention of winding down at the end of the day sometimes does not actually achieve that purpose. Instead, he challenges his readers to paint, go for a walk, or find the things that really give you the relaxation you actually want....
transformer architecture sweet spot
(DRAFT for now) What is the transformer architecture? Let me try for a, hopefully, sweet-spot explanation. A deep neural network, trained by backpropagation on language data, first by self-supervised learning (aka pre-training) using masked language modeling, and then by fine-tuning for tasks like text summarization, part-of-speech labeling, named entity recognition, question answering, translation, and others. Self-supervision, by way of next-token prediction or more generally masked language modeling, lets a model be trained without human-generated labels....
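Not from the draft itself, just a tiny sketch of the “no human labels needed” point: next-token prediction manufactures its own (context, target) training pairs from raw text. The whitespace split here is a hypothetical toy tokenizer, standing in for a real subword tokenizer.

```python
# Minimal illustration (my own sketch): self-supervision via next-token prediction
# turns raw text into training pairs with no human annotation.

def next_token_pairs(text: str):
    """Yield (context, target) pairs from raw text using a toy whitespace tokenizer."""
    tokens = text.split()  # stand-in for a real subword tokenizer
    for i in range(1, len(tokens)):
        context = tokens[:i]  # everything seen so far
        target = tokens[i]    # the "label" is simply the next token
        yield context, target

for context, target in next_token_pairs("the model predicts the next token"):
    print(context, "->", target)
```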